I was installing Steam on a laptop recently and had the presence of mind to remove Steam from the “startup programs” list that it automatically adds itself to on install. I generally don’t like programs automatically running on startup because, when shutting down or restarting my computers, I really prefer they come back in a “clean” state. That way, if I’m experiencing some odd behavior on my PC that is potentially caused by some background program, I can be reasonably confident that the issue will resolve itself after a restart. Most programs really have no business running on startup. The only exception I can think of at the moment is cloud file systems like OneDrive, where not running the program puts me at risk of torn cloud-sync state that is annoying to resolve.
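For the curious, here’s a minimal sketch of what “removing a program from startup” can look like under the hood on Windows, assuming the program registered itself under the per-user Run registry key. That’s an assumption on my part; Steam’s actual mechanism could just as well be the Task Scheduler or its own in-app setting.

```python
import winreg

# One common place Windows programs register themselves to launch at login.
# Whether Steam actually uses this key (vs. Task Scheduler, etc.) is assumed here.
RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

def list_startup_entries():
    """Return {entry name: command line} for the current user's Run key."""
    entries = {}
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
        i = 0
        while True:
            try:
                name, value, _ = winreg.EnumValue(key, i)
            except OSError:
                break  # no more values under this key
            entries[name] = value
            i += 1
    return entries

def remove_startup_entry(name):
    """Delete one entry; roughly equivalent to unchecking 'run on startup'."""
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY, 0, winreg.KEY_SET_VALUE) as key:
        winreg.DeleteValue(key, name)

if __name__ == "__main__":
    for name, cmd in list_startup_entries().items():
        print(f"{name}: {cmd}")
    # remove_startup_entry("Steam")  # uncomment to actually remove the entry
```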
Which got me thinking, why does Steam do this?
I suspect the uninteresting answer is that it lets them run an auto-updater check on every machine start, with the side benefit of surfacing ads for Steam sales and new titles every time the user starts up their machine.
I don’t agree with the “ads” justification I’m imagining, but as a software developer, I could see myself wanting to reduce the chances of users hosting really outdated Steam clients on their PCs. But the key word there is “hosting”.
You don’t need to run automatically on every boot to do an auto-updater check. You can just do that whenever the user opens Steam, and of course the Steam client does this as well. So there’s probably a UX angle here: wanting to reduce the chance that users see the auto-update screen in the moments when they’re actively headed to play a game.
The following is a scenario that many game platforms (Steam, Xbox, PlayStation, etc.) put a lot of thought into preventing:
- You boot up [game platform of choice] to go play [game of choice]
- For any number of reasons, your time to play is limited (you’re busy, you’re playing with someone else and have to sync schedules, etc.)
- You’re blocked by an update screen and waste precious play time doing what feels like admin work.
- If you’re particularly unlucky, after updating the platform itself there’s an update for the game to take as well.
- Before you know it, a significant portion of your play time is lost.
Anyway, that’s a user scenario that I can empathize with, but negatively affecting a user’s boot time by default doesn’t feel like the right call. You could argue that, because it can be disabled, it’s well within the user’s power to tweak this setting to their liking, and that the average user may not be tech-literate enough to enable such a setting on their own anyway. I’d argue the inverse is true as well: a software culture that’s opt-out rather than opt-in is likely to hurt these less tech-literate users in the long run. They’re the users whose computers will slow to a crawl and who won’t feel confident enough to fix it themselves.
An alternate idea to make this opt-in would be to present users with a screen at the end of an auto-update that encourages them to enable “run on startup”, with a justification and a “don’t ask me again” box if they choose to continue without it (something like the sketch below). This is, of course, a simple, obvious solution, and I don’t think it was overlooked. I just suspect there are other business incentives (e.g. user telemetry) that encourage the current opt-out pattern, and that frustrates me.
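As a rough illustration of what I mean, the logic is basically just a persisted three-way preference checked at the end of an update. All of the names and the settings-file location below are hypothetical, not anything Steam actually does:

```python
import json
import os

# Hypothetical location for the client's persisted prompt preferences.
SETTINGS_PATH = os.path.expanduser("~/.myclient/startup_prompt.json")

def load_settings():
    try:
        with open(SETTINGS_PATH) as f:
            return json.load(f)
    except FileNotFoundError:
        # Opt-in default: run-on-startup stays off until the user says otherwise.
        return {"run_on_startup": False, "dont_ask_again": False}

def save_settings(settings):
    os.makedirs(os.path.dirname(SETTINGS_PATH), exist_ok=True)
    with open(SETTINGS_PATH, "w") as f:
        json.dump(settings, f)

def maybe_prompt_after_update():
    settings = load_settings()
    if settings["run_on_startup"] or settings["dont_ask_again"]:
        return  # already opted in, or the user asked not to be nagged again
    answer = input(
        "Enable run-on-startup so updates finish before you sit down to play? [y/n/never] "
    ).strip().lower()
    if answer.startswith("y"):
        settings["run_on_startup"] = True   # explicit opt-in
    elif answer == "never":
        settings["dont_ask_again"] = True   # the "don't ask me again" box
    save_settings(settings)

if __name__ == "__main__":
    maybe_prompt_after_update()
```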
But I’d also like to be open to the possibility that there’s a non-obvious but good justification I haven’t considered. Maybe if I ever learn of one, I’ll remember to update this post. My own history as a software developer has made me more jaded/empathetic toward some “dark side” software development strategies, and maybe I’ll find myself on a project someday faced with a similar UX design choice. Maybe I’ll have the gall to push back, or maybe I’ll decide it’s not worth the fight and roll over so I can log off at 5 PM.