When something is practically correct, but not necessarily correct.

Technically, and only under very specific circumstances, you could even decrease the GPU workload by increasing resolution. Like if there are a whole bunch of on-screen particle effects and at lower resolutions they overlap more.
Something that is usually true, but doesn't have to be true.
Like it's usually true that higher resolution is more GPU intensive than CPU intensive. That will probably be true for every game out there, relatively speaking.
But it doesn't have to be true.
You could make some weird design decisions where the graphics get simpler at higher resolutions while the NPC count multiplies and leans harder on the CPU.
You wouldn't ever actually do that, but it's possible. Something like the toy sketch below.
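Just to make that concrete, here's a deliberately silly, completely hypothetical sketch (the function and every number in it are made up, no real game does this) where detail goes down and NPC count goes up as resolution increases, so the CPU load grows with resolution instead of the GPU load:

```python
# Purely hypothetical "backwards" scaling: it only exists to show that the
# resolution -> GPU-load link is a design choice, not a law of nature.

def settings_for_resolution(width: int, height: int) -> dict:
    pixels = width * height
    # Pretend the engine lowers graphical detail as resolution rises...
    detail_level = max(1, 10 - pixels // 1_000_000)
    # ...while spawning more CPU-simulated NPCs the more pixels you render.
    npc_count = 50 + pixels // 10_000
    return {"detail_level": detail_level, "npc_count": npc_count}

for res in [(1280, 720), (1920, 1080), (3840, 2160)]:
    print(res, settings_for_resolution(*res))
```

Run that and 4K ends up with the simplest graphics but the most NPCs to simulate, which is exactly the kind of "possible but nobody would do it" case above.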
If you're including everything else that often gets scaled along with graphics, the CPU can easily be the bottleneck. Minecraft, for example, can run at a high resolution, but to actually feed that higher resolution you'd want a bigger draw distance, and at least in old Minecraft that meant a lot more CPU usage (environment chunks need to be calculated, and the meshes also need to be generated on the CPU for the GPU to render).
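Rough back-of-the-envelope on that (ignoring details like simulation distance or exactly which chunks get loaded): render distance is measured in chunks, and the game keeps roughly a square of chunks loaded around the player, so the CPU-side work of generating and meshing chunks grows roughly with the square of the draw distance.

```python
# Approximate chunk count for a given render distance: a square of chunks
# (2r + 1) on a side centered on the player, each needing world generation
# and (in old versions) CPU-side mesh building before the GPU can draw it.

def chunks_loaded(render_distance: int) -> int:
    side = 2 * render_distance + 1
    return side * side

for rd in (8, 16, 32):
    print(f"render distance {rd:>2}: ~{chunks_loaded(rd)} chunks to generate and mesh")
```

Doubling the draw distance roughly quadruples the chunk work, and all of that lands on the CPU before the GPU ever sees a triangle.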