With PCs you can upgrade your rig, and there's a steady progression, so each year the "average" PC is a bit better than last year's. Game developers have to choose between targeting higher specs for a flashier game or lower specs for a larger audience and potentially more sales. The important thing is that they publish recommended specs, so people with lower-spec systems understand they'll get a substandard experience.
With a console there is only one set of specs. It doesn't matter that the PS4 was released seven years ago; people haven't been swapping out processors, RAM or graphics cards, so the average PS4's specs are the same as they were at launch.
If people are complaining the graphics are substandard on the PS4, either the publisher falsely advertised by implying players could expect PC-level visuals, or the players failed to temper their expectations. On the other hand, if the framerate is dropping to a level where it's detrimental to the game, then they put out a shit product.
There is one humongous difference: with consoles, the developer knows exactly what hardware they're dealing with, so if they want to release a decent product, that version of the game should be optimised for that hardware. If they release a game that runs like crap on the very system it was supposed to be optimised for, then it is a shit product.