Hey everyone, I just went out and upgraded from my 15" HP monitor to a 19" Samsung 932BW widescreen monitor. Boy was I excited....
Then I got home and hooked it up. I had some trouble turning it on at first, but it finally cooperated and powered on.
I plugged in the DVI cable because I hear it gives the nicest picture.
But for some strange reason, when I hook up DVI it uses up a lot of my computer's resources: 8-40% CPU at all times, with all programs closed and every third-party process shut down as well. Why would DVI make my computer use system resources?
In Task Manager, "System" and "explorer.exe" are what's making it lag so much.
Not to mention that in the games I play, I'm noticing a huge drop in FPS.
The strange thing is, it's ONLY DVI that does this. When I plug in the analog (VGA) cable it runs at about 2-5% at all times, which still doesn't make much sense to me, since I don't see why a monitor would use up system resources at all.
It's not a virus, because just switching inputs makes a noticeable difference...
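In case it helps anyone narrow this down, here's a rough Python sketch I could run to log which processes spike while I switch between DVI and analog. It assumes the psutil package is installed; the function name and numbers are just placeholders I made up for illustration.

```python
import time
import psutil

def log_top_processes(duration_s=60, interval_s=2, top_n=5):
    """Print the top CPU-using processes every few seconds."""
    # Prime cpu_percent() so the first reading isn't 0.0 for every process.
    for p in psutil.process_iter():
        try:
            p.cpu_percent(None)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass

    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(interval_s)
        samples = []
        for p in psutil.process_iter(['name']):
            try:
                # CPU percent since the previous call for this process.
                samples.append((p.cpu_percent(None), p.info['name']))
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                pass
        samples.sort(reverse=True)
        print(time.strftime('%H:%M:%S'),
              ', '.join(f"{name}: {cpu:.1f}%" for cpu, name in samples[:top_n]))

if __name__ == '__main__':
    log_top_processes()
```

Running it once on DVI and once on analog should show whether it's really "System" and "explorer.exe" that jump, or something else hiding in the background.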
PLEASE help me figure out what the problem could be.
I've updated the drivers for my video card.
My video card is an NVIDIA GeForce 7600 GS.
I also installed the drivers for my monitor's specific model number (which, ironically, made things way worse than when I started, so I uninstalled them).
Why would a new monitor cause my system to use so many resources? And why would switching inputs have anything to do with it?
If it's the monitor's inputs that are bad, I'll return it and get a new one, but that doesn't seem likely to me =[
P.S. - This is my first post; thanks for any and all help.