[Proposal] Understanding Participants' Psychological Wellbeing at Virtual Conferences

Description of the theoretical and practical importance of the research

Purpose statement. The purpose of the study is to develop a multi-item, multi-dimensional scale assessing online tools that contribute to virtual conference participants’ psychological wellbeing.

Justification and importance. The COVID-19 pandemic of 2020 has changed the way people work, learn, and interact with each other. One of the most prominent features of the new reality is a heavy reliance on technology. A recent poll conducted by the Pew Research Center revealed that 81% of adults working from home use video conferencing services and 57% use instant messaging platforms (Parker et al., 2020). While technology has created wonderful opportunities for remote work, it has also created threats to user wellbeing.

A series of recent studies have focused on the negative impacts of technology and remote work. For example, Fauville et al. (2021) developed the Zoom Exhaustion & Fatigue Scale based on evidence of the cognitive load associated with video conferencing. Moreover, people’s fatigue and productivity loss can be exacerbated by the detrimental impact of the COVID-19 pandemic on mental health (Karatepe et al., 2021). Although previous research has illuminated the harmful effects of technology, the challenge of making people productive and thriving in the new work environment using technological tools remains to be addressed. In line with the positive psychology approach, which emphasizes increasing people’s flourishing rather than pursuing the traditional psychological goal of decreasing people’s suffering (Seligman, 2012), this study focuses on the beneficial elements of the online environment and their impact on people’s wellbeing.

Moreover, in response to a growing need to identify best practices for organizing online events, several studies in the broader literature have explored online tools in the context of virtual conferences and developed guidelines on maximizing their effectiveness (Reshef et al., 2020; Roos et al., 2020; Rubinger et al., 2020). However, research on the psychological outcomes of online events remains limited, despite the emergence of new measures such as the PERMA (positive emotions, engagement, relationships, meaning, achievement) framework, which allows exploring the meaning of events (Armbrecht et al., 2019; Seligman, 2012). Therefore, this study investigates virtual conference attendees’ wellbeing based on the PERMA framework (Seligman, 2012). Additionally, this research provides an instrument to measure the impact of online tools that support the attendees’ wellbeing across the PERMA dimensions.

Methods and Procedures

This study follows widely adopted scale development procedures (Boateng et al., 2018; Churchill, 1979; DeVellis, 2016; Gerbing & Anderson, 1988) in three phases:

Phase 1. Item development.

This phase included defining the domain for the scale and identifying questions appropriate for measuring participants’ wellbeing during online events. It involved compiling a list of online tools that support the planning and coordination of virtual events from the academic literature and the meetings and events industry literature. The prepared list was distributed to six (6) experts (professors specializing in hospitality technology and event management) for their review.

Phase 2. Scale development.

This phase involves distributing the scale items to the target sample, item reduction and purification, and identification of the factor structure of the scale being developed. An exploratory factor analysis (EFA), item-to-total correlations, and coefficient alpha will be used for item purification and latent structure identification (Churchill, 1979).
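The two item-purification statistics named above are routine computations on the response matrix; the following is a minimal illustrative sketch in NumPy (function and variable names are ours, not part of the study materials):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    totals = items.sum(axis=1, keepdims=True)
    rest = totals - items  # scale total with the focal item removed
    return np.array([np.corrcoef(items[:, j], rest[:, j])[0, 1]
                     for j in range(items.shape[1])])
```

In this scheme, items with low corrected item-to-total correlations, or whose removal raises coefficient alpha, become candidates for deletion during purification.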

Participants will be recruited on Prolific using a two-step approach. First, a screening study will be set up to identify the pool of participants who have attended an online conference to be included in the Custom AllowList. Then, the main study will be conducted. After reading a brief description of the study, respondents will be asked to describe their recent virtual conference experience. Next, they will rate each of the measurement items on a 7-point Likert-type scale from 1 (strongly disagree) to 7 (strongly agree), indicating the importance of each item for their psychological wellbeing at an online conference.

Phase 3. Scale evaluation.

In this phase, a refined survey will be distributed to the target sample to collect responses and assess the robustness of the scale. The survey will be identical to the one used in Phase 2, except for the measurement items excluded during scale purification. The scale evaluation will include an assessment of scale reliability and validity. A confirmatory factor analysis (CFA) will be used to assess the scale for reliability (coefficient alpha and composite reliability) and validity (convergent, discriminant, and criterion validity) (Boateng et al., 2018; Churchill, 1979; Gerbing & Anderson, 1988). The scale will be finalized based on the results of these analyses.
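For the reliability portion of this phase, composite reliability (and the related average variance extracted used in convergent validity checks) can be computed directly from standardized CFA loadings. A minimal sketch, with purely hypothetical example loadings:

```python
import numpy as np

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    explained = lam.sum() ** 2
    error = (1 - lam ** 2).sum()  # error variance of each standardized indicator
    return explained / (explained + error)

def average_variance_extracted(loadings) -> float:
    """AVE = mean squared standardized loading; values above .50 support convergent validity."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).mean())
```

For instance, a factor with three standardized loadings of .80 would yield CR ≈ .84 and AVE = .64, both above the conventional .70 and .50 thresholds.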

Sample size estimation

Phase 1 (item development) is completed by the researchers and expert judges and is therefore not included in the sample size calculation in this proposal.

Phase 2 (scale development) and Phase 3 (scale evaluation) require data collection for the EFA and CFA. Boateng et al. (2018) explained that a sample size for the scale development process may be calculated based on the number of items included in the scale or independently of this consideration. Because the number of measurement items remaining in the survey after Phases 2 and 3 is uncertain, this research uses the alternative approach, which does not rely on the number of measurement items for the sample size calculation.

Comrey and Lee (1992, as cited in Boateng et al., 2018) suggest that a sample of 1,000 or more respondents is excellent for scale development purposes. Following this recommendation, this study aims for a sample of 1,200 respondents, which includes a buffer against data loss during data cleaning. This sample size is also likely to accommodate the ratio of 10 participants per measurement item suggested by Hair et al. (2018). The sample will be shared equally between Phases 2 and 3 of this research.
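The arithmetic behind this split can be sketched as follows (an illustration assuming the 10:1 ratio is applied to each phase's subsample, which the proposal does not state explicitly):

```python
total_n = 1_200               # target sample, including the data-cleaning buffer
per_phase = total_n // 2      # Phase 2 (EFA) and Phase 3 (CFA) use separate subsamples
max_items = per_phase // 10   # 10 respondents per item (Hair et al., 2018)
print(per_phase, max_items)   # 600 respondents per phase, supporting up to 60 items
```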

To create a profile of respondents likely to attend online conferences, we applied a custom prescreening on Prolific identifying U.S. residents who are fluent in English, are employed part-time or full-time, and use technology at work at least several times a week. The prescreening identified 11,651 members active in the last 90 days, which suggests that recruiting the desired sample on Prolific should be feasible.

Description of the study costs

We will survey the entire pool of prescreened Prolific participants (11,651) to form a Custom AllowList of those who have attended an online conference. Assuming the 40% response rate reported by Prolific (2018), we expect to receive 4,660 responses. According to Eventbrite, about 62% of people attended virtual events in July 2020 (Powell, 2020). Moreover, many prominent companies pivoted to virtual events in 2020, such as the Microsoft Build Conference 2021 and CES 2021, one of the world’s most influential technology events (CES, n.d.; Microsoft, n.d.). Using this estimate, we expect about 2,889 individuals to qualify for inclusion in our Custom AllowList. Obtaining a sample of 1,200 respondents then constitutes a 41% response rate, which is within the expected response rate range reported by Prolific (2018).
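The recruitment funnel above can be reproduced with simple arithmetic (a sketch using the response and attendance rates cited in the text; variable names are ours):

```python
pool = 11_651                                # prescreened Prolific members
screener_responses = int(pool * 0.40)        # 40% response rate (Prolific, 2018)
allowlist = int(screener_responses * 0.62)   # 62% attended virtual events (Powell, 2020)
target = 1_200                               # desired main-study sample
main_study_rate = target / allowlist         # required main-study response rate
print(screener_responses, allowlist, main_study_rate)
```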

Therefore, the cost of data collection could be estimated as follows:

Pre-screening: 4,660 * £0.13 (1-minute survey * £7.80/hour) + 33% service fee = £807.73

Main study: 1,200 * £2.50 (20-minute survey * £7.50/hour) + 33% service fee = £4,000.00

Study total: £4,807.73

Open Science Commitment

This study is pre-registered on AsPredicted.org. The authors are committed to sharing the data files via open channels (e.g., OSF, the authors’ website) and publishing the manuscript in an open-access journal identified via the Directory of Open Access Journals (DOAJ) database.


References

Armbrecht, J., Lundberg, E., & Andersson, T. D. (Eds.). (2019). A research agenda for event management. Edward Elgar Publishing.

Boateng, G. O., Neilands, T. B., Frongillo, E. A., Melgar-Quiñonez, H. R., & Young, S. L. (2018). Best practices for developing and validating scales for health, social, and behavioral research: A primer. Frontiers in Public Health, 6, 149.

Churchill, G. A., Jr. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16(1), 64-73.

DeVellis, R. F. (2016). Scale development: Theory and applications (3rd ed.). Sage.

Fauville, G., Luo, M., Muller Queiroz, A. C., Bailenson, J. N., & Hancock, J. (2021). Zoom Exhaustion & Fatigue Scale. Available at SSRN 3786329.

Gerbing, D. W., & Anderson, J. C. (1988). An updated paradigm for scale development incorporating unidimensionality and its assessment. Journal of Marketing Research, 25(2), 186-192.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2018). Multivariate data analysis (8th ed.). Cengage.

Karatepe, O. M., Saydam, M. B., & Okumus, F. (2021). COVID-19, mental health problems, and their detrimental effects on hotel employees’ propensity to be late for work, absenteeism, and life satisfaction. Current Issues in Tourism, 1-18.

Parker, K., Menasce Horowitz, J., & Minkin, R. (2020, December 9). How the coronavirus outbreak has – and hasn’t – changed the way Americans work. Pew Research Center.

Powell, O. (2020, August 13). Eventbrite turns focus on virtual events. Conference News.

Prolific. (2018, September 12). Using our demographic filters to prescreen participants.

Reshef, O., Aharonovich, I., Armani, A. M., Gigan, S., Grange, R., Kats, M. A., & Sapienza, R. (2020). How to organize an online conference. Nature Reviews Materials, 5(4), 253-256.

Roos, G., Oláh, J., Ingle, R., Kobayashi, R., & Feldt, M. (2020). Online conferences – towards a new (virtual) reality. Computational and Theoretical Chemistry, 1189, 112975.

Rubinger, L., Gazendam, A., Ekhtiari, S., Nucci, N., Payne, A., Johal, H., Khanduja, V., & Bhandari, M. (2020). Maximizing virtual meetings and conferences: A review of best practices. International Orthopaedics, 44(8), 1461-1466.

Seligman, M. E. (2012). Flourish: A visionary new understanding of happiness and well-being. Simon and Schuster.

Microsoft. (n.d.). Microsoft Build. https://mybuild.microsoft.com/home

CES. (n.d.). CES 2021 on demand. Consumer Technology Association.