
4 types of Zoom fatigue and how to combat them

 https://www.fastcompany.com/90608399/4-types-of-zoom-fatigue-and-how-to-combat-them

That bleary-eyed, foggy-brained feeling of “Zoom fatigue” is a widely accepted pandemic phenomenon—but how can you prevent it? And what exactly causes it?

Researchers at Stanford University just released the first peer-reviewed psychological study of Zoom fatigue, and its results are surprising. They found four quite different causes, along with a helpful fix for each:

1. Close-up eye contact is exhausting

In a typical Zoom discussion, the amount of intensive eye contact far exceeds what you would experience in real-life interactions. Think about it: When you take a walk-and-talk with a friend, you might have mere moments of eye contact; in a conference room, listeners look at their screens and their notes or gaze out the window. At the same time, Zoom faces are typically larger and closer than you’d experience in real-life work discussions, fooling your mind into perceiving an intensely intimate conversation. “In effect, you’re in this hyperaroused state,” says Jeremy Bailenson, founding director of Stanford’s Virtual Human Interaction Lab.

The fix: Shrink attendees’ faces by switching to grid view, and sit back a bit to allow yourself more personal space.

2. Watching yourself is exhausting

In real life, you are not followed by a mirror, and you might spend five minutes a day looking at your reflection. The researchers cite studies showing that when seeing one’s own reflection, people are more critical of themselves. “It’s taxing on us. It’s stressful. And there’s lots of research showing that there are negative emotional consequences to seeing yourself in a mirror,” says Bailenson.

The fix: Confirm that your lighting and setup look good, and then adjust the settings to hide your view of yourself.

3. Sitting immobile is exhausting

In typical in-person discussions, people move around. On Zoom, people sit immobile for hours on end. “There’s a growing body of research that says that when people are moving, they’re performing better cognitively,” says Bailenson.

The fix: Create a wider visual field for your camera. For example, an external camera often allows you more space to move than a laptop camera, because you no longer need to remain within arm’s reach of the keyboard.

4. Video chatting is cognitively exhausting

Your brain works much harder to send and receive cues through a screen. Multiply that effort across hours of exaggerated expressions and increased concentration, and your mind simply consumes more power.

The fix: When it’s feasible, turn off your camera for breaks—and turn your body away from the screen.

Bottom line

The researchers hope that videoconferencing apps will incorporate solutions to these problems into their basic setups.

By the way, none of these fatigue inducers are specific to Zoom—they apply to all videoconferencing. We do not envy the Zoom marketing staff tasked with removing “Zoom fatigue” from the lexicon. You can take a quiz on your own Zoom fatigue here.