Ask the expert: should I turn off UAC?

If cross-platform gaming were a horse race, the PC would be a thoroughbred: faster, more powerful and with a penchant for expensive oats. Like any rarefied animal or highly tuned machine, though, it doesn't take much to damage it. Our resident tech vet Adam Oxford is here to put it out of your misery.

UAC: good or bad?

I was having an argument about the benefits of UAC in Windows Vista/7. I'm of the opinion that it does a good job and that anyone turning it off is asking for trouble; I compared not using it to logging in as root on Linux for everyday use. Could you clear this up, please?

jon_hill987

PCG : Like most people, I've come full circle on this one. When Vista launched, UAC was a good idea poorly executed. It prevents programs from installing or altering system files without explicit approval from the user, and it helps stop you accidentally installing malware, so it's a good thing. But because it popped up so often and needed so many clicks to clear, I ended up turning it off and relying on a firewall and antivirus software for security instead.

Under Windows 7 it's far less intrusive, so I leave it on all the time. But it still pops up more often than, say, Ubuntu's root prompt. Why? Because Windows doesn't make the same strong distinction between user space and root space as Linux. Logic says an app like Core Temp, which reads CPU sensors, shouldn't need access to UAC-protected areas every time it's run, but because of the way Windows is constructed, it does. On the other hand, there are a good many Linux apps that misbehave because they don't have root access: try changing a graphics driver setting from the GUI, for example. So there's no perfect solution just yet.
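If you're ever unsure whether a program is actually running elevated, Windows exposes this through its token API. Here's a minimal C sketch (Vista and later only, with error handling trimmed for brevity) that asks whether UAC has granted the current process full administrator rights:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE token = NULL;
        TOKEN_ELEVATION elevation = { 0 };
        DWORD size = sizeof(elevation);

        /* Open our own process token with query rights. */
        if (!OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token))
            return 1;

        /* TokenElevation reports whether UAC elevated this process. */
        if (GetTokenInformation(token, TokenElevation, &elevation, size, &size))
            printf("Process is %srunning elevated.\n",
                   elevation.TokenIsElevated ? "" : "not ");

        CloseHandle(token);
        return 0;
    }

A tool like Core Temp has to answer "yes" here because reading CPU temperature sensors means talking to a kernel-mode driver, which Windows generally only permits from an elevated process.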

Why don't graphics chips cook?

My question isn't really anything serious, more of a curiosity. I was wondering why a GPU's thermal rating is always so much higher than a CPU's? For instance, Nvidia's new Fermi cards will safely run at temps above 90C without breaking a sweat! Nvidia even lists the maximum temp at 105C on their website. Meanwhile, even the most adventurous overclocker won't push a CPU's temps much above 70C under load. Why is that? It just seems like there shouldn't be that large a discrepancy.

Makius

PCG : Good question. There are differences in the way the two types of processor are manufactured that affect how they cope with heat, but I put the fundamental part of the question to Lars Weinand, Senior Technical Marketing Manager at NVIDIA.

“A CPU is much more error critical than a GPU running graphics,” he explained. “For example, if the CPU gives a wrong value when running something as 'simple' as Excel, the whole OS can crash, whereas a slightly wrong pixel colour when a GPU is pushing out millions upon millions of pixels at once does not cause a game to crash. This is simplifying things, but the end result is that a GPU has a much higher thermal range. Also, half of the die space on a CPU is dedicated to cache, and its few cores have long pipelines, which creates hotspots where heat builds up. On a GPU the heat is spread over a larger die area, hence GPUs having a higher heat threshold.”
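If you'd rather read your own card's limits than take the spec sheet's word for it, newer NVIDIA drivers include NVML, the management library that tools like nvidia-smi are built on. Assuming your card and driver support it (the threshold query needs a reasonably recent NVML version), a short C sketch can report the core temperature and the point at which the driver starts throttling; link against nvidia-ml:

    #include <stdio.h>
    #include <nvml.h>

    int main(void)
    {
        nvmlDevice_t dev;
        unsigned int temp = 0, slowdown = 0;

        /* Initialise NVML and grab the first GPU in the system. */
        if (nvmlInit() != NVML_SUCCESS)
            return 1;

        if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
            nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
            nvmlDeviceGetTemperatureThreshold(dev,
                NVML_TEMPERATURE_THRESHOLD_SLOWDOWN, &slowdown);
            printf("GPU core: %u C (slowdown threshold: %u C)\n",
                   temp, slowdown);
        }

        nvmlShutdown();
        return 0;
    }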

Giving it the boot

At reboot, or normal boot-up, I come to a black and white screen with roughly the words "Boot error, please insert disc and hit Enter." So essentially, it again does not bring me to the "which OS do you want" screen, just this new screen.

So, I put in the Win 7 disc, and did a hard reboot. As expected, after reboot, the next words to come up were "Hit any key to boot from CD or DVD...", but to test things, I hit NO KEY. I let it pass. Guess what? The two choices screen came up! "Earlier version of Windows, or Windows 7". Very perplexed, I chose Win 7, and it boots up no problem.

So, to test it out again, I took out the 7 DVD, and rebooted. Wham, same problem black and white screen: "Boot error, please insert disc and hit Enter". So I did, but this time I DID hit a key, went into 7 setup, and tried the "Win 7 repair startup" option, and rebooted w/out the disc and tried it again; again, SAME "boot error" message!

But when I put the Win 7 disc in, hard reboot, let it come back up, display the "Hit any key to boot from DVD" and I SKIP THIS (hitting no key), it then goes to the "correct" screen of offering me two choices. I hit Win 7 every time, and it boots up fine.

The_Terminator

PCG : The_Terminator's question comes from our excellent new tech forum, which is well worth checking out as a first port of call for your problems. There's some more background to the issue there, along with some excellent advice for repairing the Master Boot Record (MBR) of a damaged Windows installation. I think there may be a simpler solution, though: it sounds as if the hard drive order got swapped around during a BIOS flash. Even though nothing has changed physically, if your XP hard drive has been set ahead of the Windows 7 one in the boot order, it would account for nearly all the problems you describe here.
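If it does turn out to be MBR damage rather than boot order, the repair console on the Windows 7 disc (Command Prompt under System Recovery Options) includes the bootrec tool, and the standard commands are worth keeping to hand:

    bootrec /FixMbr        writes a fresh MBR without touching the partition table
    bootrec /FixBoot       writes a new boot sector to the system partition
    bootrec /RebuildBcd    scans all drives for Windows installs and rebuilds the boot menu

Run in that order, they should cure most garden-variety boot corruption, and none of them should touch your data; a backup first never hurts, though.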
