UAC was too interruptive for most people's tastes. The drivers were not Vista's fault; lazy driver devs were to blame. And I got it on a top-of-the-line system where it ran smoothly. Like you say, people put it on underpowered systems and expected it to be the best thing ever.
User-friendliness was improved by the time the service packs were released. By then the driver issue had been solved as well. But the damage was already done.
But what most people don't realize: Win7 uses the same driver model as Vista. If Win7 had been released when Vista was, we would now all be talking about terrible drivers in Win7!
I like 10 a lot. I was too young to even remember Vista really. I think my first laptop had it, but I just wanted to play WoW. 7 was dope though.
IMO no, the O/S itself was a good improvement on XP and brought a lot of new features.
The reason it has a bad rep is that, in the time since XP launched, the number of people using PCs had increased dramatically, due to the dot-com-era explosion of the internet and due to XP having made computers more user-friendly than 9x. This meant there was a large number of newer users who didn't understand that running a brand-new OS on older hardware would result in terrible performance (or that buying a budget PC that barely met the recommended specs was a bad idea), and so when Vista launched with the exact same issues XP and 9x had, the noobs raged.
The most comical thing about the debacle was that when Microsoft released a revised version of Vista called "7" roughly three years later, it was well received and became universally loved (due to having essentially the same requirements three years later).
Yeah, I realize it was not all Vista's fault. I've used Vista drivers on Win7 computers a number of times, printer drivers most of all: when the manufacturer switched to universal drivers that didn't support all the features of the printer, the Vista driver would still work on Win7. I always kept my gaming computer on the newest OS for the DirectX updates, and since it was built for gaming it handled Vista with no issues.
Vista was a great OS. It allowed the normal user to perform elevation for local administration tasks (try to disconnect a network adapter in XP!), and the removal of the "Power Users" group plus Parental Controls solved the computer virus problems once and for all.
Also, the new OEM BIOS activation method (which could be faked) allowed all users to install updates without the risk of their system being invalidated. No more WINLOGON patching and constant fear.
Last edited by Tackhisis; 2017-04-21 at 11:48 AM.
It was mostly down to 10 years of shit design and security.
UAC was fucking nuts. No software was expecting it; most just wrote files all over the place.
By the time most software had caught up and stopped throwing security errors (because apparently writing user files into the Program Files folder is a crap idea), Vista had already got a reputation for it. Most people just stuck to XP until 7 came out and games stopped working under XP because they needed a later version of DirectX. A trick they later repeated with DX12 under Windows 10: the only way to get people to upgrade is to demand it for new games.
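To illustrate the point about Program Files: per-user files belong in the user's profile, which Windows exposes via environment variables like %APPDATA%. A minimal sketch (the app name "MyApp" is hypothetical, and the non-Windows fallback is just for illustration):

```python
import os

def user_data_dir(app_name: str) -> str:
    """Return a per-user writable directory for app data.

    On Windows, %APPDATA% points at the roaming profile
    (e.g. C:\\Users\\me\\AppData\\Roaming), which is writable
    without elevation, unlike Program Files. On other OSes we
    fall back to a dot-directory in the home folder.
    """
    base = os.environ.get("APPDATA")  # set by Windows; absent elsewhere
    if base is None:
        return os.path.join(os.path.expanduser("~"), "." + app_name.lower())
    return os.path.join(base, app_name)
```

Software that did this from the start never tripped UAC's virtualization or threw access-denied errors under Vista.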
Don't know, I've never used it
Yeah, I had a smooth experience with Vista as well. I had built a 64-bit system, and Windows XP 64-bit had about zero driver support, so if I wanted to use all of my system I really needed Vista at the time, and it ran things much more smoothly than XP. Perhaps it's because I built a pretty good machine at the time.
Yeah, same here. I built a good comp back in the day and opted for 64-bit Vista, because 64-bit XP was truly one of the worst OSes Microsoft ever created and I didn't want to be memory-limited with a 32-bit OS.
Vista worked fine for me. 7 was still better in every way, though, and I upgraded as soon as possible.
I skipped 7 entirely. I only upgrade OS when I upgrade parts and did not upgrade parts until after 8 was released, so went to 8, which I also had no problems with despite people hating it so much. Again though, I had brand new hardware that was very good, so no problems with a new OS.
Vista had a few issues -
- It only ran well on hardware that wasn't readily available to the masses
- They promised a lot with Windows Longhorn and couldn't deliver so there were a lot of bad PR cycles
- A lot of the features that we're used to in W7+ were just in their infancy in Vista and were unstable/bad
Windows 7 is basically a fixed version of Vista, and while Vista was a bad OS, it's still an OS that had to come at some point. The same story can be told about Windows ME and 8.
Just realized I rambled without answering the question.
Yes, it was bad.
Why?
You needed new hardware to properly run it.
It was bloated.
EAX support was gone due to the removal of DirectSound.
Many of the promised features never appeared.
DX10 turned out to be hype from a user aspect.
I don't remember much else, I didn't touch it until Bioshock came out.
Last edited by MrPaladinGuy; 2017-04-21 at 01:47 PM.
10850k (10c 20t) @ all-core 5GHz @ 1.250v | EVGA 3080 FTW3 Ultra Gaming | 32GB DDR4 3200 | 1TB M.2 OS/Game SSD | 4TB 7200RPM Game HDD | 10TB 7200 RPM Storage HDD | ViewSonic XG2703-GS - 27" IPS 1440p 165Hz Native G-Sync | HP Reverb G2 VR Headset
That's a complicated question. Vista's perception involves not just the OS but a lot of surrounding context.
1) It had bugs at launch. Every previous OS did. People who remember XP fondly probably weren't there for, or don't remember, XP's launch. Until SP1 they were both really rough.
2) Vista was late because it DIDN'T ship with all of its promised features, and many half-done things had to be ripped out. So you had a new product that took longer to go gold, without many noticeable user-level improvements.
3) Vista DID ship with a more secure driver layer, but that broke a lot of apps that were used to doing anything they wanted with memory and the OS. A lot of manufacturers didn't prioritize having proper working drivers (potentially so people would be forced to buy new products that would work), so a lot of software and hardware that worked under XP suddenly stopped working. And UAC was really aggressive, to the point that it pissed people off or drove them to turn it off.
4) Microsoft was pressured by PC manufacturers to keep the stated requirements low. This led to a lot of underpowered PCs trying to run the OS; often they could only do it by turning off a lot of UI features (e.g. Aero), and to many users it looked like they had just wasted money on another copy of XP.
Windows 7 was essentially all that Windows Vista was trying to be. It had the added benefit that Vista had done the hard work and taken it on the chin, and Microsoft didn't back down on reasonable requirements to run it. A spectrum of UAC options also helped, so people could adjust how nannying it was without fully disabling it.
Was it bad overall? Not really. Like with XP, SP1 corrected a lot of issues and SP2 refined it. It was perfectly serviceable as long as you weren't trying to run it on a potato. Personally, I never had a problem with it outside of those early buggy days.
Last edited by Gamer8585; 2017-04-21 at 01:48 PM.
It is, though. It's literally kernel 10. There are a few ways to look at it. Win 10 was originally going to be the 10th main version of Windows, given that 8 and 8.1 are effectively totally different versions. 8.1 is not just a service pack; it's a completely (mostly) new kernel. 8 is as different from 8.1 as Vista is from 7. Win 10 was originally going to be 6.4, but they changed it for a number of reasons (one being that it will have multiple revisions; we're on 10.4 now). But it is definitely the 10th release, AND its kernel number is 10 as well. It would be silly for them to have named Windows 10 kernel 7, 7.1, 7.2, etc.
Win Version   Kernel #   Market #
Win 1         1          1
Win 2         2          2
Win 3         3          3
Win 95/98     4.0/4.1    4
Win 2k/XP     5.0/5.1    5
Win Vista     6.0        6
Win 7         6.1        7
Win 8         6.2        8
Win 8.1       6.3        9
Win 10        10.0       10
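The table's kernel-to-marketing mapping can be restated as a simple lookup (this just encodes the poster's table, splitting the combined rows; it is not official Microsoft data):

```python
# Marketing name -> reported kernel version, per the table above.
KERNEL_VERSIONS = {
    "Windows 95": "4.0",
    "Windows 98": "4.1",
    "Windows 2000": "5.0",
    "Windows XP": "5.1",
    "Windows Vista": "6.0",
    "Windows 7": "6.1",
    "Windows 8": "6.2",
    "Windows 8.1": "6.3",
    "Windows 10": "10.0",
}
```

The jump from 6.3 to 10.0 is the discontinuity the post is arguing about: every release from Vista through 8.1 stayed on kernel 6.x, while the marketing number kept climbing.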
Gaming: Dual Intel Pentium III Coppermine @ 1400mhz + Blue Orb | Asus CUV266-D | GeForce 2 Ti + ZF700-Cu | 1024mb Crucial PC-133 | Whistler Build 2267
Media: Dual Intel Drake Xeon @ 600mhz | Intel Marlinspike MS440GX | Matrox G440 | 1024mb Crucial PC-133 @ 166mhz | Windows 2000 Pro
IT'S ALWAYS BEEN WANKERSHIM | Did you mean: Fhqwhgads | "Three days on a tree. Hardly enough time for a prelude. When it came to visiting agony, the Romans were hobbyists." -Mab
Given that the release was years late and overhyped, the scene was set for abysmal failure. And yes, for most people it was truly horrible. The reason wasn't that the core OS sucked; in fact, the core had several improvements over XP. However, out of the box the OS immediately started indexing your entire hard disk, which made even the faster machines bog down completely, often for several days, leading to extremely poor first impressions. Several other processes piled on, making it a nightmare unless you knew how and what to turn off (and how to keep it turned off, which was even more of a challenge).

A compounding factor was that in the run-up to the OS release, memory prices had skyrocketed due to accidental supply-chain shortages in Korea. As a result, in the first two years of the OS, PCs shipped with far less memory than anticipated and hence weren't meeting the sweet spots for which the workloads had been tuned.