


The technostress questionnaire: a pilot study

Essay
Authors: Finstad Georgia Libera, Giorgi Gabriele

Submission Date: 2021-10-25
Review Date: 2021-10-29
Publication Date: 2021-11-05

Introduction

From work-related stress to the concept of technostress

Work-related stress represents one of the greatest risks for safety and health at work, with detrimental consequences for individuals and organizations (Hassard, Teoh, Visockaite, Dewe, & Cox, 2018; van der Molen, Nieuwenhuijsen, Frings-Dresen, & de Groene, 2020). The available data at the European level paint an alarming picture in which 25% of workers work under pressure most of the time (Eurofound & EU-OSHA, 2014). The literature highlights several factors capable of eliciting a negative response, defined as stressors, such as workload, the degree of flexibility, interpersonal relationships, the role within the organization, the boundaries between work and private life and the level of job insecurity, just to name a few (International Labour Organization (ILO), 2016). These characteristics are not static and change over time, putting pressure on workers to adapt successfully. In this sense, the progressive digitalization of the world of work represents one of the clearest examples (International Labour Organization (ILO), 2018). In Europe alone, the ICT sector accounts for around 4% of gross domestic product (GDP) and is responsible for one third of the increase in overall industrial production (European Commission (EC), 2016). In this regard, the European Union is sponsoring several initiatives in order to increase digital competitiveness and lead to the creation of a Gigabit Society capable of guaranteeing Europe a leading position, as established in the Connectivity Targets (European Parliament & European Parliament Think Tank, 2017). On the one hand, ICTs represent one of the major drivers of economic growth and collective evolution. The use of technology as a strategic variable breaks down communication barriers, improves processes, guarantees a continuous flow of data, contributes to innovation thanks to direct access to information and reduces costs (Tarafdar, Tu, Ragu-Nathan, & Ragu-Nathan, 2007; Wang, Shu, & Tu, 2008). On the other hand, in accordance with the new line of research focused on technostress, the use of technology modifies and/or further worsens the effect of traditional psychosocial risks while creating new sources of stress (EU-OSHA, 2018; La Torre, Esposito, Sciarra, & Chiappetta, 2019). In addition, the COVID-19 pandemic has further accentuated the burden of using ICT, underlining the need for continuous monitoring of work-related factors. For example, data from the Italian Ministry of Labor and Social Policies indicate that in April 2020 there were 1,827,792 remote workers and that, of these, 1,606,617 had started working remotely after the COVID-19 outbreak (Ministero del Lavoro e delle Politiche Sociali, 2020). The term Technostress (TS) was introduced by Brod (Brod, 1984) to identify the “inability to adapt or cope with new computer technologies in a healthy manner”. To date, it has been widely demonstrated that the negative effects of technostress involve symptomatic manifestations such as heart problems, increased blood pressure, increased cortisol levels, decreased heart rate variability (HRV), mood changes, immune system changes, burnout, job dissatisfaction, fatigue, decreased productivity, concentration problems, role stress, absenteeism and turnover (Ragu-Nathan, Tarafdar, Ragu-Nathan, & Tu, 2008; Riedl, Kindermann, Auinger, & Javor, 2012, 2013; Tarafdar et al., 2007).

The characteristics involved in the experience of technostress: technostressors and moderators

In order to analyze the phenomenon of technostress it is important to underline that it is not the technology itself that is stressful, but rather some aspects and characteristics relating to its use. These same characteristics are part of the theoretical models of the reference literature and form the basis of pre-existing psychometric tools, such as the transactional model of Ragu-Nathan et al. (Ragu-Nathan et al., 2008), the model based on the Person-Environment fit by Ayyagari et al. (Ayyagari, Grover, & Purvis, 2011) or the model by Salanova et al. (Salanova, Llorens, & Cifre, 2013), which is based on the RED framework (resources-demands-experiences). For example, the mere fact of having a large amount of data in real time and a perpetual flow of information creates the expectation of having to perform activities according to tight deadlines, thus generating high productivity expectations that workers are not always able to cope with. This leads to phenomena defined as “data smog”, “information overload” or “Information Fatigue Syndrome (IFS)” (Lewis, 1996; Weil & Rosen, 1997). In this regard, the results of a study conducted in five countries suggest that 74% of managers are subjected to stress due to information overload (Klausegger, Sinkovics, & “Joy” Zou, 2007). The information often comes from different external and internal sources (e.g., e-mail, virtual applications, smartphones) and workers have to simultaneously perform tasks of a different nature, resulting in “continuous partial attention” and excessive multitasking (Marulanda‐Carter & Jackson, 2012; Weil & Rosen, 1997). In this regard, Dabbish and Kraut (Dabbish & Kraut, 2006) use the term “email overload” to define the overload caused by receiving and sending emails. Similarly, the findings of Jackson and colleagues (Jackson, Dawson, & Wilson, 2003) suggest that employees feel compelled to respond to messages and therefore stop their current activities. The availability of information from different channels also implies a greater level of ambiguity about which tasks should be given priority. For example, the research of Tarafdar and colleagues (Tarafdar et al., 2007) conducted on a sample of 233 employees showed a direct association between technostress and role stress. Likewise, role ambiguity and role conflict can arise if specific learning opportunities are not provided (training dimension) and if the pace of change is too high (Ayyagari et al., 2011; La Torre et al., 2019). Evidence shows that this fast pace is a highly stressful experience that can lead to a form of “technology fatigue” (Day, Paquet, Scott, & Hambley, 2012; Sami & Pangannaiah, 2006; Weil & Rosen, 1997). There is in fact a discrepancy between the speed of changes and the adjustment process of individuals, so that employees are unable to keep up with the new versions released, the new systems implemented and the technologies used (Brod, 1984; Chilton, Hardgrave, & Armstrong, 2005). Furthermore, thanks to the use of technology, the spatial and temporal structure of work is much more fluid: individuals can work outside standard office hours and outside company premises, be available even on vacation, and send and receive communications thanks to the use of smartphones, tablets and laptops.
The need to be continuously available can result in the inability to interrupt activities and workaholic phenomena such as Inability to Switch Off (ITSO) and Fear Of Missing Out (FOMO), or the fear of losing job requests and important messages even if these do not fall within the established schedules (Gaudioso, 2015; Salanova et al., 2013). For example, research by Duxbury and colleagues (Duxbury, Higgins, Smart, & Stevenson, 2014) analyzed the use of BlackBerry smartphones in a sample of 25 workers through a longitudinal study. The results of the analyses identified a group (13 subjects) defined as “struggling segmentors” who used their devices 24/7 and felt obliged to do so by their company. The constant use of ICT also involves a virtualization of social relations, which are the basis of the entire socialization process that builds the sense of organizational identification (Liao, 2017; Rhoads, 2010; Wiesenfeld, Raghuram, & Garud, 2001). Communications mediated by electronic devices can decrease the ability of individuals to interact with others and their level of empathy (Lyon, 1985), creating a “depersonalized” work environment in which the construction of social sense is undermined (Ayyagari et al., 2011) and which is often associated with lower levels of trust between supervisor and employees, lower levels of job satisfaction, greater conflicts, a sense of isolation and greater levels of stress (Day et al., 2012; Golden, Veiga, & Dino, 2008). Other sources of technostress identified in the literature concern the electronic monitoring of employee activities and the level of reliability of systems and devices (e.g., crashes, malfunctions) (Ayyagari et al., 2011; Day et al., 2012; Hudiburg, 1995; McNall & Stanton, 2011). Despite the possible benefits, electronic monitoring can be negatively perceived by employees, leading to feelings of violation. Evidence shows that constant supervision is a highly stressful experience associated with less job satisfaction and less organizational commitment (Day et al., 2012; Wells, Moorman, & Werner, 2007), while in the event of malfunctions the worker is forced to perform the task again and develops the fear of future problems, leading to feelings of frustration and stress (Ayyagari et al., 2011; O’Driscoll, Brough, Timms, & Sawang, 2010). On a professional level, employees may consider their skills inadequate with respect to technological innovations, leading to fears of unemployment and economic stress. For example, the data suggest that one third of currently existing jobs will be replaced by Smart Technology, Artificial Intelligence, Robotics and Algorithms (STARA) by 2025 (Frey & Osborne, 2017) and that we are living in “an era of technological unemployment” (Peters, 2017). A recent study (Brougham & Haar, 2020) conducted on a sample of 1516 workers in three countries (United States, Australia and New Zealand) showed that the perceived danger of technological disruption is positively associated with job insecurity and turnover intentions.
The literature also highlights some factors that can moderate the negative effect of technology including: the level of training provided to cope with the changes introduced and the involvement of users in the design and implementation phases (Ragu-Nathan et al., 2008; Tarafdar et al., 2007; Tarafdar, Tu, & Ragu-Nathan, 2010), perceived usefulness and usability (part of the literature on the acceptance and adoption of technology) (Ayyagari et al., 2011; Davis, 1989) and the perceived ability in the use of ICT (construct of technological self-efficacy) (D. Compeau, Higgins, & Huff, 1999). In this regard, the research by Tarafdar and colleagues (Tarafdar et al., 2010) shows that employee involvement increases satisfaction in the use of technological systems and decreases technostress while employee training has a dual purpose: to increase the specific technical skills for a given technology and positively change attitudes and perceptions towards it (Marler, Liang, & Dulebohn, 2006). For example, several studies suggest that there is a negative association between technological skills and technostress levels and that employees with adequate training (e.g. at least 8 hours) show lower levels of stress, strain and dissatisfaction (Korunka & Vitouch, 1999; Ragu-Nathan et al., 2008). Regarding perceived usefulness and usability, following the P-E perspective, employees who do not perceive the usefulness of technologies and believe they can do their work in different ways may have a greater perception of the workload. Likewise, if employees view technologies as complex, any work related to the use of that technology will be perceived as more challenging (Ayyagari et al., 2011). Finally, analyzing the concept of self-efficacy, the evidence suggests an association with a positive attitude towards computers (Venkatesh & Davis, 1996), greater adaptation accompanied by less reluctance towards ICT-related changes (Ellen, Bearden, & Sharma, 1991) and greater motivation to persist (Deng, Doll, & Truong, 2004).

Development of the theoretical model

The main purpose of this research is to preliminarily test a new psychometric tool aimed at assessing stress related to ICT in the workplace. To achieve this goal and therefore to devise the dimensions and items of the questionnaire, it is necessary to extrapolate the crucial areas of the individual-technology relationship. This process can be accomplished by reviewing the literature and previous measurement tools (Streiner, Norman, & Cairney, 2015). Therefore, on the basis of a careful review of the available studies, the main characteristics of ICT involved in the experience of technostress were identified (recurring concepts in the literature). These characteristics were then grouped on the basis of their conceptual similarity and previous empirical results into 15 dimensions that form the sub-scales of the questionnaire. In addition, the Stress Questionnaire (Giorgi, Arcangeli, & Cupelli, 2013; Mucci et al., 2015) was used as a starting point for the transposition of traditional stressors in the context of digital stress. Table 1 shows the definitions of the different technological characteristics and the support received from previous research.

Area of investigation | Definition | Support from pre-existing literature
Usefulness | The employee’s perception of the usefulness of technologies in carrying out the job | (Ayyagari et al., 2011; Brod, 1984; Davis, 1989; Sami & Pangannaiah, 2006; Weil & Rosen, 1997)
Usability | The level of ease perceived by the employee in learning and using technologies | (Ayyagari et al., 2011; Brod, 1984; Davis, 1989; Sami & Pangannaiah, 2006; Weil & Rosen, 1997)
Reliability | The trust that the employee places in the correct functioning of technologies and in the absence of malfunctions | (Ayyagari et al., 2011; Brod, 1984; Butler & Gray, 2006; Day et al., 2012; Day, Scott, & Kevin Kelloway, 2010; Hudiburg, 1995; O’Driscoll et al., 2010)
Technological self-efficacy | The degree of competence perceived by the employees in the use of technologies | (D. Compeau et al., 1999; D. R. Compeau & Higgins, 1995a, 1995b; Salanova et al., 2013; Shu, Tu, & Wang, 2011)
Role | The degree of clarity perceived by the employee about the responsibilities and duties related to the use of technologies in carrying out job tasks | (Ayyagari et al., 2011; Giorgi, Arcangeli, & Cupelli, 2012; Ragu-Nathan et al., 2008; Rangarajan, Jones, & Chin, 2005; Tarafdar et al., 2007)
Multitasking | The degree to which the use of technologies involves processing different information, following multiple tasks simultaneously and interrupting the main work activities | (Gaudioso, 2015; Mark, Voida, & Cardello, 2012; Marulanda‐Carter & Jackson, 2012; Ragu-Nathan et al., 2008; Tarafdar et al., 2007, 2010; Weil & Rosen, 1997)
Job control | The level of autonomy experienced in the use of technologies at work | (Day et al., 2012, 2010; Giorgi et al., 2012; Karimikia, Singh, & Joseph, 2020; Kraan et al., 2014; O’Driscoll et al., 2010; Ragu-Nathan et al., 2008)
Job demands | The perception of increased pressure and workload due to the use of technology in the workplace | (Chesley, 2010; Day et al., 2012, 2010; Klausegger et al., 2007; Lewis, 1996; Ragu-Nathan et al., 2008; Tarafdar et al., 2007; Weil & Rosen, 1997)
Pace of change | The perception of the speed of ICT-related changes and the consequent perception of not having adequate skills combined with the pressure for updates | (Arnetz & Wiholm, 1997; Ayyagari et al., 2011; Brod, 1984; Chilton et al., 2005; Day et al., 2012, 2010; Sami & Pangannaiah, 2006; Weil & Rosen, 1997)
Pervasiveness/Work-life balance | The perception of always being connected to work even outside standard hours and blurred home-work boundaries due to the use of technologies | (Ayyagari et al., 2011; Day et al., 2012, 2010; Duxbury et al., 2014; Jacukowicz & Merecz-Kot, 2020; Nam, 2014; Ragu-Nathan et al., 2008)
Privacy/monitoring | The perception that employees have of the traceability of their work activity due to ICT and the related compromise of privacy | (Ayyagari et al., 2011; Day et al., 2012, 2010; McNall & Stanton, 2011; Weil & Rosen, 1997; Wells et al., 2007)
Employability | The degree to which employees perceive that they do not have adequate technology skills and think that their future job may be at risk | (Ayyagari et al., 2011; Brougham & Haar, 2020; Garrido, Sullivan, & Gordon, 2010; Giorgi et al., 2012; Giorgi, Arcangeli, Mucci, & Cupelli, 2015; Korunka, Weiss, Huemer, & Karetta, 1995; Ragu-Nathan et al., 2008; Tarafdar et al., 2010)
Supervisor support | The degree to which the employee perceives the relationship with the supervisor as predominantly virtual, with fewer opportunities for face-to-face interactions, direct feedback and support | (Day et al., 2012, 2010; Golden et al., 2008; Liao, 2017; Lyon, 1985; Rhoads, 2010; Staples, 2001; Vayre & Pignault, 2014; Wiesenfeld et al., 2001)
Colleague support | The degree to which the employee perceives the relationship with colleagues as predominantly virtual, with fewer opportunities for face-to-face interactions, direct feedback and support | (Day et al., 2012, 2010; Golden et al., 2008; Liao, 2017; Lyon, 1985; Rhoads, 2010; Staples, 2001; Vayre & Pignault, 2014; Wiesenfeld et al., 2001)
Involvement | The degree to which employees receive information about the benefits of the technologies, the changes that will be implemented and the level of their involvement in the process | (Brod, 1984; Day et al., 2012, 2010; Mckeen & Guimaraes, 1997; Parsons, Liden, O’Connor, & Nagao, 1991; Ragu-Nathan et al., 2008; Tarafdar et al., 2010)
Training | The level of training provided for the use of new technologies (e.g. software and hardware) together with the presence of specific opportunities and adequate time for learning | (Beas & Salanova, 2006; Day et al., 2012, 2010; Marler et al., 2006; Ragu-Nathan et al., 2008; Tarafdar et al., 2007)
Tab. 1 – Areas of investigation, related definitions and support from pre-existing literature

Materials and methods 

Questionnaire design and item development

For the development of the questionnaire scales, the guidelines suggested by the literature (DeVellis, 2012; Streiner et al., 2015) were followed through the following steps: establishing the purpose of the scales, creating the respective items, choosing the type of response format and evaluating content validity. The items (written in Italian) were generated with the aim of adequately representing the pivotal dimensions of technostress. Both positively and negatively worded items were used in the construction of the scales. Negatively worded items were then reverse-coded for statistical analysis, so that all items could be scored in the same direction. Regarding the response format, we used a 5-point Likert scale with the following response options: 1) totally disagree, 2) disagree, 3) neither agree nor disagree, 4) agree, 5) totally agree. This whole process led to the formulation of 82 items, which were then subjected to critical analysis by a group of experts, academics and psychology students in order to evaluate content validity (Streiner et al., 2015). The aim was to evaluate the relevance of the items with respect to the constructs investigated. Furthermore, judgments were also collected about the linguistic aspects of the items (e.g. ease of understanding, ambiguity).
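As a minimal illustration of the reverse-coding step (the original study did not report code), the snippet below shows how negatively worded items on the 5-point scale described above could be recoded; the item names and the list of negative items are hypothetical.

```python
import pandas as pd

# Hypothetical responses on the 1-5 Likert scale (1 = totally disagree, 5 = totally agree).
responses = pd.DataFrame({
    "TQ01": [5, 4, 2],   # positively worded item, kept as-is
    "TQ02": [1, 2, 4],   # negatively worded item, to be reverse-coded
})

NEGATIVE_ITEMS = ["TQ02"]      # assumed list of negatively worded items
SCALE_MIN, SCALE_MAX = 1, 5

# On a 1-5 scale a response x becomes (1 + 5) - x, so that higher scores
# always point in the same direction before computing scale totals.
responses[NEGATIVE_ITEMS] = (SCALE_MIN + SCALE_MAX) - responses[NEGATIVE_ITEMS]
print(responses)
```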

Data collection

The data come from a convenience sample consisting of two subsamples. The first sample (hereafter sample 1) includes 104 subjects, while the second sample (hereafter sample 2) includes 131 subjects, for a total of 235 participants. Sample 1 includes workers belonging to a company in the insurance sector. Data collection was conducted from January 1, 2020 to January 31, 2020 by means of self-administered paper questionnaires that were collected 30 days after their distribution. Participation took place on a voluntary and anonymous basis. Of the 194 questionnaires initially distributed, 104 were found to be valid, corresponding to a response rate of 53.6%. For sample 2, data collection was conducted from April 17, 2020 to August 4, 2020 through the Google Survey platform. Selection criteria included being over the age of 18 and being a worker (public or private sector, employed or self-employed). Participation took place on a voluntary and anonymous basis. The format included a first page in which the objectives of the research were explained and in which the information on the use of data for statistical/scientific purposes was made explicit. The informed consent of the participants was obtained through the same page. In order to complete the online questionnaire, it was necessary to answer each item, hence there are no missing data for sample 2. The following socio-demographic information was also collected for both samples: gender, age, type of contract and length of service.

Experimental design and statistical analysis

All analyses were performed using the Statistical Package for the Social Sciences (SPSS) software, version 20.0 (IBM, 2011). The analysis process followed various steps: missing data analysis, descriptive analysis, reliability analysis and factorial structure analysis. Missing data in sample 1 were filled in with the series mean, and specific items were reverse-coded for the whole sample. After analyzing the characteristics of the sample (considered as categorical variables), reliability analyses were performed for each subscale, grouping the items according to the dimensions conceptually traced a priori. This provided an initial idea of the performance of the scales (Parasuraman, Zeithaml, & Malhotra, 2005). Reliability tests were performed using Cronbach’s alpha coefficients as indicators of internal consistency (DeVellis, 2012). Desirable values vary between 0.70 and 0.90, although the literature also accepts values close to 0.60 (Hair, 2006). Especially by virtue of the exploratory nature of the study, in this case we considered values close to 0.60 acceptable. The “Cronbach’s alpha if item deleted” values and the item-total correlations were also analyzed. The values of the item-total correlation should be above 0.30 (Kline, 1986). In addition, Pearson’s product-moment correlation matrices were examined for items with low correlations. Following this step, the items whose removal improved the reliability coefficients were eliminated, and both the alpha coefficients and the correlation matrices were recalculated. Despite strong and elaborate theoretical premises, we continued by carrying out an exploratory factor analysis (EFA) to reveal the natural pattern of the data without imposing a structure (Fabrigar, Wegener, MacCallum, & Strahan, 1999). EFA can be conducted to provide a basis for subsequent confirmatory factor analysis (CFA), both being based on the common factor model, and the two analyses can be conducted concurrently by randomly splitting the sample in two. However, given the relatively small sample size and the exploratory nature of the study, it was decided not to split the observations and not to perform the CFA at this stage. First, data were screened to analyze factorability, i.e. the presence of a substantial number of significant relationships between the items. Pearson’s product-moment correlation coefficient values should be at least 0.30 and no more than 0.80 to avoid multicollinearity (Field, 2009). In addition, values from Bartlett’s test of sphericity and the Kaiser-Meyer-Olkin measure of sampling adequacy (KMO) were analyzed. For Bartlett’s test, significance values lower than 0.05 indicate suitable data, while for the KMO the minimum acceptable value is 0.50 (Kaiser, 1974; Tabachnick & Fidell, 1996). Regarding communalities, there is still no unanimous agreement on the degree of common variance required to retain a variable. For example, Fullagar (Fullagar, 1986) and Child (Child, 2006) suggest a threshold value of 0.20. To determine the number of factors to include, the criteria used were the Kaiser criterion of eigenvalues greater than 1 (Gorsuch, 1988) and the Scree test (Cattell, 1966). Finally, the interpretation of the factors was based on the factor pattern matrix coefficients, for which the pattern coefficients or “loadings” should be at least 0.30, the rule of at least three variables per factor and the critical consideration of possible underlying theoretical constructs (Field, 2009).
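The analyses themselves were run in SPSS 20.0; purely as an illustrative sketch of the same pipeline (series-mean imputation, Cronbach’s alpha, corrected item-total correlations, Bartlett’s test, KMO and principal axis factoring with a promax rotation), the Python code below shows how a single sub-scale could be screened. The helper names, the DataFrame `items` (subjects × items of one sub-scale) and the use of the third-party factor_analyzer package are assumptions, not part of the original study.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items (rows = subjects, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)


def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the sum of the remaining items (target > 0.30)."""
    return pd.Series(
        {col: items[col].corr(items.drop(columns=col).sum(axis=1)) for col in items}
    )


def explore_scale(items: pd.DataFrame, n_factors: int = 1) -> None:
    # Series-mean imputation of missing responses (as done for sample 1).
    items = items.fillna(items.mean())

    # Reliability diagnostics.
    print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
    print("Corrected item-total correlations:")
    print(corrected_item_total(items).round(2))

    # Factorability: Bartlett's test of sphericity (p < 0.05) and KMO (>= 0.50).
    chi2, p_value = calculate_bartlett_sphericity(items)
    _, kmo_total = calculate_kmo(items)
    print(f"Bartlett chi2 = {chi2:.1f} (p = {p_value:.4f}), KMO = {kmo_total:.2f}")

    # EFA: principal factor extraction with an oblique (promax) rotation.
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="promax")
    fa.fit(items)
    eigenvalues, _ = fa.get_eigenvalues()  # input for Kaiser criterion / scree plot
    print("Eigenvalues:", np.round(eigenvalues, 3))
    print("Communalities (target >= 0.20):", np.round(fa.get_communalities(), 3))
    print("Pattern loadings (retain >= 0.30):")
    print(np.round(fa.loadings_, 2))
    print("Proportion of variance explained:", np.round(fa.get_factor_variance()[1], 3))
```

Called on one sub-scale at a time (e.g. the four reliability items), `explore_scale(items)` would print the same kind of diagnostics reported in the Results section.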

Results

The first qualitative analyses of content validity provided adequate results. The group of experts/academics evaluated the items as representative of the constructs and linguistically understandable. Following their judgments, two items (TQ81, TQ82) were eliminated, for a total of 80 items that were administered to the sample of 235 subjects.

In the overall sample of 235 subjects, males accounted for 47.7% and females for 52.3%. Most of the subjects were under the age of 45 (54.9%) and had an open-ended contract (66.5%). As regards length of service, 59.1% of subjects had been working for more than 10 years. The demographic characteristics, stratified by sample, are presented in Table 2.

Population characteristics | Sample 1 N (%) | Sample 2 N (%) | Full sample N (%)
Gender | | |
Male | 70 (67.3) | 42 (32.1) | 112 (47.7)
Female | 34 (32.7) | 89 (67.9) | 123 (52.3)
Age | | |
< 45 years | 45 (43.3) | 84 (64.1) | 129 (54.9)
45-60 years | 45 (43.3) | 20 (15.3) | 65 (27.7)
> 60 years | 14 (13.5) | 27 (20.6) | 41 (17.4)
Type of contract | | |
Open-ended contract | 94 (92.2) | 61 (46.6) | 155 (66.5)
Fixed-term contract | 2 (2) | 12 (9.2) | 14 (6)
Collaboration | 4 (3.9) | 21 (16) | 25 (10.7)
Other | 2 (2) | 37 (28.2) | 39 (16.7)
Length of service | | |
< 10 years | 17 (16.3) | 79 (60.3) | 96 (40.9)
> 10 years | 87 (83.7) | 52 (39.7) | 139 (59.1)
Tab. 2 – Characteristics of the study population stratified by sample

The results show Cronbach’s alpha values ranging from fair to excellent depending on the scale. In some cases, items whose removal improved internal consistency were eliminated, obtaining reliability values between α = 0.60 and α = 0.88 (usefulness/usability α = 0.79, reliability α = 0.66, technological self-efficacy α = 0.75, role α = 0.81, multitasking α = 0.61, job control α = 0.75, job demands α = 0.88, pace of change α = 0.60, pervasiveness/WLB α = 0.88, privacy/monitoring α = 0.66, employability α = 0.72, supervisor support α = 0.81, colleague support α = 0.80, involvement α = 0.75, training α = 0.87). All item-total correlation values exceeded the 0.30 threshold and inspection of the correlation matrices revealed that all coefficients were significant (p < 0.001; p < 0.05). The communalities were all above 0.20, except for some values slightly below the threshold (TQ24 = .183, TQ2 = .168). For all scales, the KMO test of sampling adequacy showed values above the recommended threshold of 0.50 and Bartlett’s test of sphericity was significant (p < 0.001). Kaiser’s criterion of eigenvalues greater than 1 and the Scree plot test yielded a one-factor solution explaining around 50% or more of the variance, with at least three variables with factor loadings ≥ 0.30, for all scales (usefulness/usability 55.36%, reliability 49.67%, technological self-efficacy 57.77%, role 64.00%, job control 66.95%, job demands 54.54%, pace of change 56.37%, pervasiveness/WLB 62.54%, privacy/monitoring 60.13%, employability 54.93%, supervisor support 57.20%, colleague support 56.21%, involvement 67.08%, training 65.81%), except in the case of the multitasking scale, for which the analyses yielded a two-factor solution. Given the lack of 3 items with factor loadings ≥ 0.30 for factor 2, the exploratory factor analysis was re-run with only the 3 items of factor 1. The results yielded a one-factor solution representing 56.69% of the variance.

Cronbach’s alpha values, means and standard deviations, and the results of the exploratory factor analysis for the sub-scales are shown in Table 3.

Scale | N. of items | Mean | SD | Cronbach’s α | Eigenvalue | Variance explained
Usefulness/usability | 5 | 11.73 | 3.62 | .79 | 2.768 | 55.36%
Reliability | 4 | 9.56 | 2.74 | .66 | 1.987 | 49.67%
Technological self-efficacy | 4 | 10.74 | 3.34 | .75 | 2.311 | 57.77%
Role | 4 | 9.48 | 3.00 | .81 | 2.560 | 64.00%
Multitasking | 3 | 8.18 | 2.43 | .61 | 1.701 | 56.69%
Job control | 3 | 8.58 | 2.43 | .75 | 2.009 | 66.95%
Job demands | 8 | 23.59 | 6.58 | .88 | 4.364 | 54.54%
Pace of change | 3 | 9.52 | 2.25 | .60 | 1.691 | 56.37%
Pervasiveness/WLB | 5 | 15.38 | 4.50 | .85 | 3.127 | 62.54%
Privacy/Monitoring | 3 | 9.75 | 2.47 | .66 | 1.804 | 60.13%
Employability | 4 | 9.36 | 3.12 | .72 | 2.197 | 54.93%
Supervisor support | 5 | 14.83 | 4.06 | .81 | 2.860 | 57.20%
Colleague support | 5 | 14.39 | 4.12 | .84 | 2.811 | 56.21%
Involvement | 3 | 9.21 | 2.50 | .75 | 2.012 | 67.08%
Training | 5 | 15.85 | 4.02 | .87 | 3.921 | 65.81%
Tab. 3 – No. of items, mean, standard deviation, Cronbach’s alpha values, eigenvalues and percentage of variance explained for the sub-scales.
Note. N = 235. The extraction method was principal axis factoring with an oblique (Promax with Kaiser Normalization) rotation.

Discussion

ICTs are shaping our relationship with reality and this inevitably implies profound transformations in the world of work. The fourth industrial revolution has enabled companies to achieve numerous positive results: cost efficiency, real-time information, instant collaborations, uninterrupted data flow, and improvement of business and strategic processes (International Labour Organization (ILO), 2018). As suggested by the EU-OSHA report (EU-OSHA, 2018) and other international bodies (Eurofound & EU-OSHA, 2014; International Labour Organization (ILO), 2018), some characteristics of ICTs may involve psychosocial, organizational and ergonomic risks if not properly managed. This means that technology partially modifies the traditional stressors (e.g. job demands, role conflict and ambiguity, employability) of organizational research, while at the same time being responsible for new stressors (La Torre et al., 2019). For this reason, the aim of this study was to investigate the key dimensions of technostress in light of work-related stress research in order to create and preliminarily evaluate a new questionnaire. In particular, the purpose of this exploratory study was to investigate the characteristics and properties of the psychometric instrument, i.e. the reliability and dimensionality of each scale. To achieve this, a literature review was conducted from the beginning of the work-related stress research tradition up to the spread and evolution of the concept of technostress starting from the 1980s, including a description of technostressors, possible moderators and outcomes, the main previous theoretical models and pre-existing psychometric tools. Based on this process, recurring concepts within technostress research were mapped and grouped into 15 key dimensions representing the scales of the questionnaire. The items representative of each dimension were then generated according to a deductive path, paying particular attention to their characteristics. The results of the internal consistency analyses, with the respective Cronbach’s alpha values, and the results of the exploratory factor analyses are encouraging and provide a good starting point for a future in-depth evaluation of the tool. Indeed, each scale showed fair to excellent reliability, despite the pilot nature of the study. Analyzing the results for each scale specifically, overall satisfactory values emerge, albeit with slight differences. The results of the exploratory factor analysis for the “technological reliability” scale yielded a one-factor solution representing 49.67% of the variance. Although factors should explain at least 50% of the variance in order to be maintained, this value is only slightly below the threshold. Despite this, all four items had loadings greater than 0.30 and therefore the solution exceeded the recommended minimum of three items per factor. Future studies should investigate the structure of this particular scale to verify its properties. Only in the case of the multitasking scale did factor analysis yield a two-factor solution, but factor 2 failed to meet the general requirement of at least 3 items with loadings greater than 0.30. For this reason, the factor was not maintained and the exploratory factor analysis was rerun with only the 3 items of factor 1. At this point, critical reasoning about the possible underlying constructs led to a number of considerations.
Generally defined as carrying out several tasks and/or projects simultaneously (Appelbaum, Marchionni, & Fernandez, 2008), multitasking involves both the processing of different information and the continuous interruption of primary activities. The items of factor 2 referred to “checking lots of different information” and “following multiple things at the same time”, while the items of factor 1 seem to refer more to frequent interruptions (e.g. “my activities are interrupted because of the use of technology” or “I am forced to neglect some tasks because new emails are always coming in”). For this reason, the retained factor 1 could be named either, more generally, “multitasking” or “interruptions”, to refer specifically to this aspect of digital multitasking. In conclusion, these results provide an interesting basis for subsequent studies. Furthermore, the results of the exploratory factor analyses supported the expected dimensionality (only the Multitasking scale yielded a two-factor structure). All the constructs analyzed contribute to the understanding of the phenomenon of technostress in the workplace, providing a specific interpretation of the implications of the technology. As already highlighted, the application of ICT has a significant impact on the entire production cycle, on the tools and equipment available, on the organization and management of work (e.g., human resources [HR] practices), on organizational structures, on hierarchical models and relationships, on the characteristics of the workforce (e.g. diversified and heterogeneous employees), and on the knowledge and skills required. Workplaces are subject to intense changes from a technical (tasks and procedures) and social (management systems, roles and hierarchies) point of view (EU-OSHA, 2018; European Commission (EC), 2016). Analyzing the characteristics of the content and context of work from the perspective of digital stress is of fundamental importance for the future of work, especially considering the changes expected in the post-COVID-19 employment world (Buomprisco, Ricci, Perri, & De Sio, 2021). For example, in 2019 fewer than one in 20 workers reported teleworking on a regular basis, while according to Eurofound’s COVID-19 survey, one third of EU workers started working from home after the pandemic (Sostero, Milasi, Hurley, Fernandez-Macías, & Bisello, 2020). According to available forecasts, 40% of companies expect a hybrid and remote work model for their employees in the future, highlighting the need to analyze new occupational risks such as technostress (Boston Consulting Group (BCG), 2020). Indeed, hybrid work represents one of the main challenges for occupational safety and health (OSH) professionals in terms of ergonomic factors (e.g. musculoskeletal disorders [MSD], visual fatigue, sedentary behavior) and psychosocial risks (technostress) (Broughton & Battaglini, 2021; International Labour Organization (ILO), 2020).

Limitations and future directions

Despite all precautions, this study is not without limitations. First, it was based on a convenience sample. Due to the COVID-19 pandemic, no further companies could be recruited for the research after January 2020. Future research could use a more adequate sampling strategy to overcome the limitations of this pilot study. Furthermore, the sample was relatively small, although of an adequate size for a pilot study. Generally, 10-20% of the main sample can be considered adequate (Baker, 1994); however, the greater the sample size, the greater the precision of parameter estimates (Johanson & Brooks, 2010). The most important limitation concerns the validation process itself, which in this study was restricted to the analysis of reliability and of dimensionality through EFA. However, our results provide a solid basis for future research, which will have to employ confirmatory methods to better examine the factorial structure of the scales and analyze construct validity (discriminant and convergent validity) (Strauss & Smith, 2009). Furthermore, an interesting direction for subsequent research is the analysis of possible second-order constructs, as in the Stress Questionnaire (Giorgi et al., 2013).

Practical Implications  

The rise of ICT and the spread of technostress underline the importance of the possible practical implications of OSH research. Once validated, the questionnaire could be used for tailor-made organizational diagnoses capable of providing a precise picture of corporate health thanks to the analysis of the scales and socio-demographic information. Some of the strategies proposed by EU-OSHA (EU-OSHA, 2018) to prevent the negative consequences of technology include the development of a code of conduct on digitization, collaboration between academics, companies and governments to analyze the human aspect of technologies, the involvement of workers in the implementation of technologies and an “advanced workplace risk assessment” to identify potential threats to health and well-being. As for progress in terms of policies, the open discussion on the technostress phenomenon and the research carried out on the subject are giving rise to a new legislative strand known as the “right to disconnect”, while as early as 2007 a ruling obtained by Raffaele Guariniello of the Turin prosecutor’s office recognized technostress as an occupational disease following an investigation into call centers (Chiappetta, 2017). The evaluation of technostress is not yet explicitly contemplated by the relevant Italian legislation, even if the assessment of the “technostress risk” can and should be carried out in the light of this regulatory framework (Chiappetta, 2017; La Torre et al., 2019). Recently, Law no. 61 of 6 May 2021, converting Law Decree 30/2021, introduced an important change on the subject of the right to disconnect (previously seen as one of the clauses to be included in the individual smart working agreement) for working parents of children under sixteen (Gazzetta Ufficiale, 2021).

The hope is that organizational research can support this process and find maximum application in the drafting of policies, agreements and laws that protect every aspect of the health and well-being of workers, guaranteeing better conditions.

References

  1. Appelbaum, S. H., Marchionni, A., & Fernandez, A. (2008). The multi‐tasking paradox: Perceptions, problems and strategies. Management Decision, 46(9), 1313–1325. https://doi.org/10.1108/00251740810911966
  2. Arnetz, B. B., & Wiholm, C. (1997). Technological stress: Psychophysiological symptoms in modern offices. Journal of Psychosomatic Research, 43(1), 35–42. https://doi.org/10.1016/S0022-3999(97)00083-4
  3. Ayyagari, R., Grover, V., & Purvis, R. (2011). Technostress: Technological Antecedents and Implications. MIS Quarterly, 35(4), 831. https://doi.org/10.2307/41409963
  4. Baker, T. L. (1994). Doing social research (2nd ed). New York: McGraw-Hill.
  5. Beas, M. I., & Salanova, M. (2006). Self-efficacy beliefs, computer training and psychological well-being among information and communication technology workers. Computers in Human Behavior, 22(6), 1043–1058. https://doi.org/10.1016/j.chb.2004.03.027
  6. Boston Consulting Group (BCG). (2020, June 30). Remote Work Works—Where Do We Go from Here? Retrieved from https://www.bcg.com/it-it/publications/2020/remote-work-works-so-where-do-we-go-from-here
  7. Brod, C. (1984). Technostress: The human cost of the computer revolution. Reading, Mass: Addison-Wesley.
  8. Brougham, D., & Haar, J. (2020). Technological disruption and employment: The influence on job insecurity and turnover intentions: A multi-country study. Technological Forecasting and Social Change, 161, 120276. https://doi.org/10.1016/j.techfore.2020.120276
  9. Broughton, A., & Battaglini, M. (2021). Teleworking during the COVID-19 pandemic: Risks and prevention strategies.
  10. Buomprisco, G., Ricci, S., Perri, R., & De Sio, S. (2021). Health and Telework: New Challenges after COVID-19 Pandemic. European Journal of Environment and Public Health, 5(2), em0073. https://doi.org/10.21601/ejeph/9705
  11. Butler, B. S., & Gray, P. H. (2006). Reliability, Mindfulness, and Information Systems. MIS Quarterly, 30(2), 211. https://doi.org/10.2307/25148728
  12. Cattell, R. B. (1966). The Scree Test For The Number Of Factors. Multivariate Behavioral Research, 1(2), 245–276. https://doi.org/10.1207/s15327906mbr0102_10
  13. Chesley, N. (2010). Technology use and employee assessments of work effectiveness, workload, and pace of life. Information, Communication & Society, 13(4), 485–514. https://doi.org/10.1080/13691180903473806
  14. Chiappetta, M. (2017). The Technostress: Definition, symptoms and risk prevention. Senses and Sciences. https://doi.org/10.14616/sands-2017-1-358361
  15. Child, D. (2006). The essentials of factor analysis (3rd ed). London ; New York: Continuum.
  16. Chilton, M. A., Hardgrave, B. C., & Armstrong, D. J. (2005). Person-Job Cognitive Style Fit for Software Developers: The Effect on Strain and Performance. Journal of Management Information Systems, 22(2), 193–226. https://doi.org/10.1080/07421222.2005.11045849
  17. Compeau, D., Higgins, C. A., & Huff, S. (1999). Social Cognitive Theory and Individual Reactions to Computing Technology: A Longitudinal Study. MIS Quarterly, 23(2), 145. https://doi.org/10.2307/249749
  18. Compeau, D. R., & Higgins, C. A. (1995a). Application of Social Cognitive Theory to Training for Computer Skills. Information Systems Research, 6(2), 118–143. https://doi.org/10.1287/isre.6.2.118
  19. Compeau, D. R., & Higgins, C. A. (1995b). Computer Self-Efficacy: Development of a Measure and Initial Test. MIS Quarterly, 19(2), 189. https://doi.org/10.2307/249688
  20. Dabbish, L. A., & Kraut, R. E. (2006). Email overload at work: An analysis of factors associated with email strain. Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work  – CSCW ’06, 431. Banff, Alberta, Canada: ACM Press. https://doi.org/10.1145/1180875.1180941
  21. Davis, F. D. (1989). Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Quarterly, 13(3), 319. https://doi.org/10.2307/249008
  22. Day, A., Paquet, S., Scott, N., & Hambley, L. (2012). Perceived information and communication technology (ICT) demands on employee outcomes: The moderating effect of organizational ICT support. Journal of Occupational Health Psychology, 17(4), 473–491. https://doi.org/10.1037/a0029837
  23. Day, A., Scott, N., & Kevin Kelloway, E. (2010). Information and communication technology: Implications for job stress and employee well-being. In P. L. Perrewé & D. C. Ganster (Eds.), Research in Occupational Stress and Well-being (Vol. 8, pp. 317–350). Emerald Group Publishing Limited. https://doi.org/10.1108/S1479-3555(2010)0000008011
  24. Deng, X., Doll, W., & Truong, D. (2004). Computer self-efficacy in an ongoing use context. Behaviour & Information Technology, 23(6), 395–412. https://doi.org/10.1080/01449290410001723454
  25. DeVellis, R. F. (2012). Scale development: Theory and applications (3rd ed). Thousand Oaks, Calif: SAGE.
  26. Duxbury, L., Higgins, C., Smart, R., & Stevenson, M. (2014). Mobile Technology and Boundary Permeability: Mobile Technology and Boundary Permeability. British Journal of Management, 25(3), 570–588. https://doi.org/10.1111/1467-8551.12027
  27. Ellen, P. S., Bearden, W. O., & Sharma, S. (1991). Resistance to technological innovations: An examination of the role of self-efficacy and performance satisfaction. Journal of the Academy of Marketing Science, 19(4), 297–307. https://doi.org/10.1007/BF02726504
  28. EU-OSHA. (2018). Foresight on new and emerging occupational safety and health risks associated with digitalisation by 2025. Retrieved from https://osha.europa.eu/en/publications/foresight-new-and-emerging-occupational-safety-and-health-risks-associated/view
  29. Eurofound, & EU-OSHA. (2014, October 13). Psychosocial risks in Europe: Prevalence and strategies for prevention. Retrieved from https://osha.europa.eu/en/publications/psychosocial-risks-europe-prevalence-and-strategies-prevention/view
  30. European Commission (EC). (2016). Digitising European Industry Reaping the full benefits of a Digital Single Market. Retrieved from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52016DC0180
  31. European Parliament & European Parliament Think Tank. (2017). Towards a European gigabit society Connectivity targets and 5G. Retrieved from https://www.europarl.europa.eu/RegData/etudes/BRIE/2017/603979/EPRS_BRI(2017)603979_EN.pdf
  32. Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4(3), 272–299. https://doi.org/10.1037/1082-989X.4.3.272
  33. Field, A. P. (2009). Discovering statistics using SPSS: And sex, drugs and rock “n” roll (3rd ed). Los Angeles: SAGE Publications.
  34. Frey, C. B., & Osborne, M. A. (2017). The future of employment: How susceptible are jobs to computerisation? Technological Forecasting and Social Change, 114, 254–280. https://doi.org/10.1016/j.techfore.2016.08.019
  35. Fullagar, C. (1986). A factor analytic study on the validity of a union commitment scale. Journal of Applied Psychology, 71(1), 129–136. https://doi.org/10.1037/0021-9010.71.1.129
  36. Garrido, M., Sullivan, J., & Gordon, A. (2010). Understanding the links between ICT skills training and employability: An analytical framework. Proceedings of the 4th ACM/IEEE International Conference on Information and Communication Technologies and Development – ICTD ’10, 1–10. London, United Kingdom: ACM Press. https://doi.org/10.1145/2369220.2369234
  37. Gaudioso, F. (2015). Tecnostress: Stato dell’arte e prospettive d’intervento. Il  punto di vista psicosociale.
  38. Gazzetta Ufficiale. (2021). LEGGE 6 maggio 2021, n. 61. Retrieved from GAZZETTA UFFICIALE DELLA REPUBBLICA ITALIANA website: https://www.gazzettaufficiale.it/eli/id/2021/05/12/21G00071/sg
  39. Giorgi, G., Arcangeli, G., & Cupelli, V. (2012). Stress lavoro correlato.  Leader e collaboratori a confronto. Edises Napoli.
  40. Giorgi, G., Arcangeli, G., & Cupelli, V. (2013). Stress Questionnaire (SQ). Firenze, Italia: Hogrefe Press.
  41. Giorgi, G., Arcangeli, G., Mucci, N., & Cupelli, V. (2015). Economic stress in the workplace: The impact of fear of the crisis on mental health. Work, 51(1), 135–142. https://doi.org/10.3233/WOR-141844
  42. Golden, T. D., Veiga, J. F., & Dino, R. N. (2008). The impact of professional isolation on teleworker job performance and turnover intentions: Does time spent teleworking, interacting face-to-face, or having access to communication-enhancing technology matter? Journal of Applied Psychology, 93(6), 1412–1421. https://doi.org/10.1037/a0012722
  43. Gorsuch, R. L. (1988). Exploratory Factor Analysis. In J. R. Nesselroade & R. B. Cattell (Eds.), Handbook of Multivariate Experimental Psychology (pp. 231–258). Boston, MA: Springer US. https://doi.org/10.1007/978-1-4613-0893-5_6
  44. Hair, J. F. (Ed.). (2006). Multivariate data analysis (6th ed). Upper Saddle River, N.J: Pearson Prentice Hall.
  45. Hassard, J., Teoh, K. R. H., Visockaite, G., Dewe, P., & Cox, T. (2018). The cost of work-related stress to society: A systematic review. Journal of Occupational Health Psychology, 23(1), 1–17. https://doi.org/10.1037/ocp0000069
  46. Hudiburg, R. A. (1995). Psychology of Computer Use: XXXIV. The Computer Hassles Scale: Subscales, Norms, and Reliability. Psychological Reports, 77(3), 779–782. https://doi.org/10.2466/pr0.1995.77.3.779
  47. International Labour Organization (ILO). (2016). Workplace Stress: A Collective Challenge. Retrieved from https://www.ilo.org/wcmsp5/groups/public/—ed_protect/—protrav/—safework/documents/publication/wcms_466547.pdf
  48. International Labour Organization (ILO). (2018). The impact of technology on the quality and quantity of jobs. Retrieved from https://www.ilo.org/wcmsp5/groups/public/—dgreports/—cabinet/documents/publication/wcms_618168.pdf
  49. International Labour Organization (ILO). (2020). Teleworking during the COVID-19 pandemic and beyond: A practical guide.
  50. Jackson, T. W., Dawson, R., & Wilson, D. (2003). Understanding email interaction increases organizational productivity. Communications of the ACM, 46(8), 80–84. https://doi.org/10.1145/859670.859673
  51. Jacukowicz, A., & Merecz-Kot, D. (2020). Work-related Internet use as a threat to work-life balance – a comparison between the emerging on-line professions and traditional office work. International Journal of Occupational Medicine and Environmental Health, 33(1), 21–33. https://doi.org/10.13075/ijomeh.1896.01494
  52. Johanson, G. A., & Brooks, G. P. (2010). Initial Scale Development: Sample Size for Pilot Studies. Educational and Psychological Measurement, 70(3), 394–400. https://doi.org/10.1177/0013164409355692
  53. Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31–36. https://doi.org/10.1007/BF02291575
  54. Karimikia, H., Singh, H., & Joseph, D. (2020). Negative outcomes of ICT use at work: Meta-analytic evidence and the role of job autonomy. Internet Research, 31(1), 159–190. https://doi.org/10.1108/INTR-09-2019-0385
  55. Klausegger, C., Sinkovics, R. R., & “Joy” Zou, H. (2007). Information overload: A cross‐national investigation of influence factors and effects. Marketing Intelligence & Planning, 25(7), 691–718. https://doi.org/10.1108/02634500710834179
  56. Kline, P. (1986). A handbook of test construction: Introduction to psychometric design. (pp. xi, 259). New York,  NY,  US: Methuen.
  57. Korunka, C., & Vitouch, O. (1999). Effects of the implementation of information technology on employees’ strain and job satisfaction: A context-dependent approach. Work & Stress, 13(4), 341–363. https://doi.org/10.1080/02678379950019798
  58. Korunka, C., Weiss, A., Huemer, K.-H., & Karetta, B. (1995). The Effect of New Technologies on Job Satisfaction and Psychosomatic Complaints. Applied Psychology, 44(2), 123–142. https://doi.org/10.1111/j.1464-0597.1995.tb01070.x
  59. Kraan, K. O., Dhondt, S., Houtman, I. L. D., Batenburg, R. S., Kompier, M. A. J., & Taris, T. W. (2014). Computers and types of control in relation to work stress and learning. Behaviour & Information Technology, 33(10), 1013–1026. https://doi.org/10.1080/0144929X.2014.916351
  60. La Torre, G., Esposito, A., Sciarra, I., & Chiappetta, M. (2019). Definition, symptoms and risk of techno-stress: A systematic review. International Archives of Occupational and Environmental Health, 92(1), 13–35. https://doi.org/10.1007/s00420-018-1352-1
  61. Lewis, D. (1996). Dying for information? An investigation into the effects of information overload in the UK and worldwide. London: Reuters.
  62. Liao, C. (2017). Leadership in virtual teams: A multilevel perspective. Human Resource Management Review, 27(4), 648–659. https://doi.org/10.1016/j.hrmr.2016.12.010
  63. Lyon, W. S. (1985). Analytically speaking: The column of our corresponding editor. Journal of Radioanalytical and Nuclear Chemistry Letters, 94(5), 287–290. https://doi.org/10.1007/BF02168259
  64. Mark, G., Voida, S., & Cardello, A. (2012). “A pace not dictated by electrons”: An empirical study of work without email. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 555–564. Austin Texas USA: ACM. https://doi.org/10.1145/2207676.2207754
  65. Marler, J. H., Liang, X., & Dulebohn, J. H. (2006). Training and Effective Employee Information Technology Use. Journal of Management, 32(5), 721–743. https://doi.org/10.1177/0149206306292388
  66. Marulanda‐Carter, L., & Jackson, T. W. (2012). Effects of e‐mail addiction and interruptions on employees. Journal of Systems and Information Technology, 14(1), 82–94. https://doi.org/10.1108/13287261211221146
  67. Mckeen, J. D., & Guimaraes, T. (1997). Successful Strategies for User Participation in Systems Development. Journal of Management Information Systems, 14(2), 133–150. https://doi.org/10.1080/07421222.1997.11518168
  68. McNall, L. A., & Stanton, J. M. (2011). Private Eyes Are Watching You: Reactions to Location Sensing Technologies. Journal of Business and Psychology, 26(3), 299–309. https://doi.org/10.1007/s10869-010-9189-y
  69. Ministero del Lavoro e delle Politiche Sociali. (2020). Sono più di 1 milione e 800  mila i lavoratori attivi in modalità smart working. Retrieved from https://www.lavoro.gov.it/stampa-e-media/Comunicati/Pagine/Sono-piu-di-1-milione-800-mila-i-lavoratori-attivi-in-modalita-smart-working.aspx
  70. Mucci, N., Giorgi, G., Cupelli, V., Gioffrè, P. A., Rosati, M. V., Tomei, F., … Arcangeli, G. (2015). Work-related stress assessment in a population of Italian workers. The Stress Questionnaire. Science of The Total Environment, 502, 673–679. https://doi.org/10.1016/j.scitotenv.2014.09.069
  71. Nam, T. (2014). Technology Use and Work-Life Balance. Applied Research in Quality of Life, 9(4), 1017–1040. https://doi.org/10.1007/s11482-013-9283-1
  72. O’Driscoll, M. P., Brough, P., Timms, C., & Sawang, S. (2010). Engagement with information and communication technology and psychological well-being. In P. L. Perrewé & D. C. Ganster (Eds.), Research in Occupational Stress and Well-being (Vol. 8, pp. 269–316). Emerald Group Publishing Limited. https://doi.org/10.1108/S1479-3555(2010)0000008010
  73. Parasuraman, A., Zeithaml, V. A., & Malhotra, A. (2005). E-S-QUAL: A Multiple-Item Scale for Assessing Electronic Service Quality. Journal of Service Research, 7(3), 213–233. https://doi.org/10.1177/1094670504271156
  74. Parsons, C. K., Liden, R. C., O’Connor, E. J., & Nagao, D. H. (1991). Employee Responses to Technologically-Driven Change: The Implementation of Office Automation in a Service Organization. Human Relations, 44(12), 1331–1356. https://doi.org/10.1177/001872679104401206
  75. Peters, M. A. (2017). Technological unemployment: Educating for the fourth industrial revolution. Educational Philosophy and Theory, 49(1), 1–6. https://doi.org/10.1080/00131857.2016.1177412
  76. Ragu-Nathan, T. S., Tarafdar, M., Ragu-Nathan, B. S., & Tu, Q. (2008). The Consequences of Technostress for End Users in Organizations: Conceptual Development and Empirical Validation. Information Systems Research, 19(4), 417–433. https://doi.org/10.1287/isre.1070.0165
  77. Rangarajan, D., Jones, E., & Chin, W. (2005). Impact of sales force automation on technology-related stress, effort, and technology usage among salespeople. Industrial Marketing Management, 34(4), 345–354. https://doi.org/10.1016/j.indmarman.2004.09.015
  78. Rhoads, M. (2010). Face-to-Face and Computer-Mediated Communication: What Does Theory Tell Us and What Have We Learned so Far? Journal of Planning Literature, 25(2), 111–122. https://doi.org/10.1177/0885412210382984
  79. Riedl, R., Kindermann, H., Auinger, A., & Javor, A. (2012). Technostress from a Neurobiological Perspective: System Breakdown Increases the Stress Hormone Cortisol in Computer Users. Business & Information Systems Engineering, 4(2), 61–69. https://doi.org/10.1007/s12599-012-0207-7
  80. Riedl, R., Kindermann, H., Auinger, A., & Javor, A. (2013). Computer Breakdown as a Stress Factor during Task Completion under Time Pressure: Identifying Gender Differences Based on Skin Conductance. Advances in Human-Computer Interaction, 2013, 1–8. https://doi.org/10.1155/2013/420169
  81. Salanova, M., Llorens, S., & Cifre, E. (2013). The dark side of technologies: Technostress among users of information and communication technologies. International Journal of Psychology, 48(3), 422–436. https://doi.org/10.1080/00207594.2012.680460
  82. Sami, L. K., & Pangannaiah, N. B. (2006). “Technostress” A literature survey on the effect of information technology on library users. Library Review, 55(7), 429–439. https://doi.org/10.1108/00242530610682146
  83. Shu, Q., Tu, Q., & Wang, K. (2011). The Impact of Computer Self-Efficacy and Technology Dependence on Computer-Related Technostress: A Social Cognitive Theory Perspective. International Journal of Human-Computer Interaction, 27(10), 923–939. https://doi.org/10.1080/10447318.2011.555313
  84. Sostero, M., Milasi, S., Hurley, J., Fernandez-Macías, E., & Bisello, M. (2020). Teleworkability and the COVID-19 crisis: A new digital divide? Eurofound – European Commission.
  85. Staples, D. S. (2001). A Study of Remote Workers and Their Differences from Non-Remote Workers: Journal of Organizational and End User Computing, 13(2), 3–14. https://doi.org/10.4018/joeuc.2001040101
  86. Strauss, M. E., & Smith, G. T. (2009). Construct Validity: Advances in Theory and Methodology. Annual Review of Clinical Psychology, 5(1), 1–25. https://doi.org/10.1146/annurev.clinpsy.032408.153639
  87. Streiner, D. L., Norman, G. R., & Cairney, J. (2015). Health measurement scales: A practical guide to their development and use (Fifth edition). Oxford: Oxford University Press.
  88. Tabachnick, B. G., & Fidell, L. S. (1996). Using multivariate statistics (3rd ed). New York, NY: HarperCollins College Publishers.
  89. Tarafdar, M., Tu, Q., Ragu-Nathan, B. S., & Ragu-Nathan, T. S. (2007). The Impact of Technostress on Role Stress and Productivity. Journal of Management Information Systems, 24(1), 301–328. https://doi.org/10.2753/MIS0742-1222240109
  90. Tarafdar, M., Tu, Q., & Ragu-Nathan, T. S. (2010). Impact of Technostress on End-User Satisfaction and Performance. Journal of Management Information Systems, 27(3), 303–334. https://doi.org/10.2753/MIS0742-1222270311
  91. van der Molen, H. F., Nieuwenhuijsen, K., Frings-Dresen, M. H. W., & de Groene, G. (2020). Work-related psychosocial risk factors for stress-related mental disorders: An updated systematic review and meta-analysis. BMJ Open, 10(7), e034849. https://doi.org/10.1136/bmjopen-2019-034849
  92. Vayre, E., & Pignault, A. (2014). A systemic approach to interpersonal relationships and activities among French teleworkers: French teleworkers’ relationships and activities. New Technology, Work and Employment, 29(2), 177–192. https://doi.org/10.1111/ntwe.12032
  93. Venkatesh, V., & Davis, F. D. (1996). A Model of the Antecedents of Perceived Ease of Use: Development and Test. Decision Sciences, 27(3), 451–481. https://doi.org/10.1111/j.1540-5915.1996.tb01822.x
  94. Wang, K., Shu, Q., & Tu, Q. (2008). Technostress under different organizational environments: An empirical investigation. Computers in Human Behavior, 24(6), 3002–3013. https://doi.org/10.1016/j.chb.2008.05.007
  95. Weil, M. M., & Rosen, L. D. (1997). TechnoStress: Coping with technology @work @home @play. New York: J. Wiley.
  96. Wells, D. L., Moorman, R. H., & Werner, J. M. (2007). The impact of the perceived purpose of electronic performance monitoring on an array of attitudinal variables. Human Resource Development Quarterly, 18(1), 121–138. https://doi.org/10.1002/hrdq.1194
  97. Wiesenfeld, B. M., Raghuram, S., & Garud, R. (2001). Organizational identification among virtual workers: The role of need for affiliation and perceived work-based social support. Journal of Management, 27(2), 213–229. https://doi.org/10.1177/014920630102700205