It’s no secret that remote work is becoming increasingly popular across various companies and public institutions. This work mode requires us to use a computer from home to communicate with the office. But have you ever stopped to think about the actual cost of using your computer?
For many businesses and public institutions, telecommuting became the norm in the wake of the health crisis. During this challenging period, numerous employers found that their employees were more productive when working from home. However, this work mode does come with additional costs, notably increased electricity consumption: after all, you often need to light your workspace, heat the room, and power your computer.
How much electricity does a computer consume?
The primary concern here is to shed light on the costs associated with using computer equipment for remote work. That said, a computer's electricity consumption varies with several factors. A laptop runs on its battery and, unlike a conventional desktop computer, does not need to stay plugged in continuously; its charger typically draws between 20 and 60 watts.
A desktop "tower", on the other hand, can draw up to 200 watts and usually needs to remain plugged in for the entire working day. The annual cost difference between the two types of computer is significant: around thirty euros for a laptop versus over a hundred for a desktop!
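As a rough sketch of the arithmetic behind those annual figures, here is a small calculation. The working hours, days per year, and electricity price below are illustrative assumptions, not figures from this article:

```python
def annual_cost_eur(power_watts, hours_per_day=8, days_per_year=220, price_per_kwh=0.30):
    """Estimate the yearly electricity cost of a device.

    Energy in kWh = watts x hours / 1000; cost = energy x price per kWh.
    The default hours, days, and price are assumptions for illustration.
    """
    kwh_per_year = power_watts * hours_per_day * days_per_year / 1000
    return kwh_per_year * price_per_kwh

# A 60 W laptop charger versus a 200 W desktop tower, under these assumptions:
print(round(annual_cost_eur(60), 2))   # laptop: around thirty euros per year
print(round(annual_cost_eur(200), 2))  # desktop: over a hundred euros per year
```

Under these assumed values the result lands close to the article's orders of magnitude: roughly 32 € per year for a 60 W laptop and about 106 € for a 200 W desktop. Your actual bill depends on your tariff and usage.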
Given this, many companies offer compensation to their employees in the form of a fixed allowance. This is typically granted on a monthly basis and paid alongside the salary. If you’re curious about such compensations, don’t hesitate to reach out to your employer or human resources department for more information.