My absurd but honest opinion here.
There's a point, around 18-20 fps, where a game is too jittery for me to play. I've played games that dropped that low on PC and PS4; MediEvil has a very rough level...
But a consistent 30 fps game feels absolutely fine for me.
20 to 30 is a monstrous jump in my opinion. 60 fps is better, but not by so much; I doubt I'd even notice it every time. I've never felt a huge difference. I'm sure I'd guess which game was running at 60 over 90% of the time, but it's not a feature I've ever felt was an absolute necessity.
The framerate benefit is about motion looking visually smoother. If motion already looks smooth to you, you're all good, I think?
Some people talk about gameplay enhancements, but that's not really true. It's not uncommon for games to calculate logic and other systems at a different rate than the visuals.
(There are also other factors affecting input. Dark Souls, for example, has added input lag; Bloodborne feels much smoother than it despite running at half the framerate.)
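That decoupling of logic rate from render rate is usually done with a fixed-timestep loop. Here's a minimal sketch of the idea; the `update`/`render`/`running` callbacks and `advance` helper are my own names for illustration, not any particular engine's API:

```python
import time

LOGIC_DT = 1 / 60  # logic always ticks at a fixed 60 Hz, whatever the render fps

def advance(leftover, frame_time, dt=LOGIC_DT):
    """Given time carried over from the last frame plus this frame's
    duration, return how many fixed logic steps to run now and the
    remainder to carry into the next frame."""
    leftover += frame_time
    steps = int(leftover // dt)
    return steps, leftover - steps * dt

def run(update, render, running):
    """Fixed-timestep loop: simulation advances in constant LOGIC_DT
    steps, while rendering happens as often as the hardware allows.
    `update`, `render`, `running` are hypothetical callbacks."""
    leftover = 0.0
    previous = time.perf_counter()
    while running():
        now = time.perf_counter()
        steps, leftover = advance(leftover, now - previous)
        previous = now
        for _ in range(steps):
            update(LOGIC_DT)         # logic is framerate-independent
        render(leftover / LOGIC_DT)  # draw at whatever fps we can manage
```

So a game rendering at 30 fps can still run its simulation at a rock-steady 60 Hz: each rendered frame just runs two logic steps.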
-----
Whether 4K is a worthwhile upgrade is a tougher call.
I went from a fairly large standard definition TV to a full HD one. That was an easy giant upgrade. There was no doubt that things looked blurry on the old TV. There were times I couldn't even make out some things in a game. It was that bad.
4K is a smaller upgrade, and it's even smaller still due to diminishing returns. At some point 4x the pixels is no longer distinguishable.
And of course for these discussions, it's always important to remember that if you're not close enough to the screen, 4K literally can't make a visible difference for you.
Additionally, that tends to be the case if you're playing content that isn't 4K; you'll be at the mercy of whatever upscaler your TV has.
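You can ballpark the "close enough" distance with the usual 20/20-vision rule of thumb: the eye resolves roughly one arcminute, so beyond the distance where one pixel spans one arcminute, extra resolution is invisible. This is a crude single-number model (real perception is messier), and the function name is mine:

```python
import math

def max_useful_distance_in(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Farthest viewing distance (inches) at which a ~20/20 eye
    (about 1 arcminute of resolving power) can still distinguish
    individual pixels; sit farther back and more pixels are wasted.
    Crude rule-of-thumb model, not a full perceptual one."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_pitch_in = width_in / horizontal_px        # size of one pixel
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin_rad)

# 55" TV: how close do you have to sit before 4K beats 1080p?
d_1080 = max_useful_distance_in(55, 1920)  # roughly 7 ft
d_4k = max_useful_distance_in(55, 3840)    # roughly 3.5 ft
```

By this model, on a 55" set you'd need to sit closer than about 7 feet for 1080p's pixels to be resolvable at all, i.e. for 4K to have any chance of looking sharper.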
Demon's Souls was disappointing in this respect. It looks great, but imo it doesn't have the wow factor Killzone Shadow Fall had once you factor out resolution.
Several people on Era said that Ratchet was the first time they felt that Killzone next gen rush.