#1
"Sparkles" in image with GeForce Ti4200
I have a PC with a GeForce Ti4200 that I use with a DVI/USB KVM switch to share an LCD monitor, USB keyboard, and mouse with another PC that has an ATI X800 XT Platinum. On the GeForce PC (call it my "work" PC) I get "sparkles" in graphic images; they look almost like LCD pixels that are stuck on white. When I switch to the PC with the ATI card (my "game" PC), the same image looks fine. The ATI control panel has "Alternate DVI mode" and "Reduce frequency for DVI" options, which eliminated what little corruption existed there. I have tried swapping cables between the two PCs and to the monitor, and tried all-new cables altogether, with no success. The same thing happens with another KVM switch for DVI and PS/2 devices, so I feel it has to be the GeForce video card. Has anyone had an issue like this, and how did you fix it? If I can't eliminate it, I may have to go with an ATI card, since that seems to work fine on my other PC. I'd hate to spend the money, though; my Ti4200 works just fine otherwise. Thanks for any assistance!
#2
On 5/5/2005 2:45 PM HockeyTownUSA brightened our day with:
> I have a PC with a GeForce Ti4200 that I use with a KVM DVI/USB switch to share an LCD monitor [snip] Thanks for any assistance!

Where are you getting the sparklies? 3D stuff, or what? If you're looking at a web page with a picture on it, does it look weird? Texture sparklies in 3D games, especially newer games, are sometimes just a fact of an older card not having the hardware to render everything without errors. Sometimes it's caused by heat. If you're seeing artifacts in regular video or imaging software, the 4200 may be in trouble.
--
People of the United States! We are Unitarian Jihad! We can strike without warning. Pockets of reasonableness and harmony will appear as if from nowhere! Nice people will run the government again! There will be coffee and cookies in the Gandhi Room after the revolution.
Steve ¤»Inglo«¤ www.inglostadt.com
#3
"Inglo" ioo@??.¿¿¿ wrote in message:
> Where are you getting the sparklies? 3D stuff or what? [snip]

No, actually, just in Windows. My wallpaper exhibits it. Funny thing is, if I switch to 16-bit color instead of 32-bit, or to a lower resolution, it disappears. On my other PC with the ATI, I realized I do get it there too, but I just select "reduce DVI frequency" and all is well. I guess I could just run in 16-bit, but I hate it: it causes banded colors on everything, since everything is done in 32-bit color now. My thought is that the KVM is at its limit. It is rated for a maximum of 1600x1200, which is what I run at. Problem is, I cannot find another DVI/USB KVM on the planet besides this one and the Belkin, and the Belkin was even worse. The damn thing was so finicky that half the time my screen wouldn't even display. *sigh* If anyone knows of another quality DVI/USB KVM, I would appreciate any links where I can buy one. Thanks!
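The "KVM at its limit" theory holds up to simple arithmetic. Single-link DVI tops out at a 165 MHz pixel clock, and 1600x1200 at 60 Hz with conventional blanking sits right at that edge, so any extra signal degradation from a KVM can tip it into sparkles; a reduced-blanking mode (which is roughly what ATI's "reduce DVI frequency" option selects) buys a large margin. A minimal sketch of the numbers, assuming nominal CVT blanking totals rather than measured values:

```python
# Rough pixel-clock check against the single-link DVI ceiling (165 MHz).
# The blanking totals below are nominal CVT figures, not measured timings.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz = total pixels per frame x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

DVI_SINGLE_LINK_LIMIT_MHZ = 165

# 1600x1200 @ 60 Hz with standard CVT blanking (2160 x 1250 total pixels)
standard = pixel_clock_mhz(2160, 1250, 60)   # ~162 MHz, just under the limit

# Same mode with CVT reduced blanking (roughly 1760 x 1235 total pixels)
reduced = pixel_clock_mhz(1760, 1235, 60)    # ~130 MHz, comfortable margin

print(f"standard blanking: {standard:.1f} MHz")
print(f"reduced blanking:  {reduced:.1f} MHz")
```

With only about 3 MHz of headroom in the standard mode, a marginal transmitter (the Ti4200's TMDS encoder) plus the extra connectors and cable length of a KVM is a plausible cause of bit errors that show up as white pixel sparkles.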
#4
On 5/6/2005 2:25 PM HockeyTownUSA brightened our day with:
> My thought is that the KVM is at its limit. It is rated for maximum 1600x1200 which is what I run at. [snip]

My monitor has two DVI inputs and one VGA, so there's no need for a KVM switch. Sell your monitor to someone who doesn't need to plug multiple PCs into it and buy one like mine; it happens to be a Sony. Then just use the KVM switch for the K and M. I'm constantly hooking up various computers to this monitor, and I had no idea when I bought it how useful the extra two inputs would be. It wasn't even a selling point, but it turned out to be a necessity. You just hit a little input button on the front of the monitor to cycle between the three inputs.
--
Steve ¤»Inglo«¤ www.inglostadt.com
#5
"Inglo" ioo@??.¿¿¿ wrote in message:
> My monitor has two DVI inputs and 1 VGA, no need for a KVM switch. [snip]

This monitor supports one DVI, VGA, S-Video, and composite input, which can be cycled with a button on the front. Unfortunately I've been spoiled by DVI; otherwise I could connect one PC to the VGA input. I hate to say it, but there is a significant difference in image quality, especially on an LCD: DVI is crisp and clear, while VGA looks blurry by comparison. It would be nice to have dual DVI inputs plus S-Video, because I hook my XBOX up to this as well via S-Video. So it is almost the perfect monitor for me (except I'd prefer dual DVI). By the way, what is the model of your Sony? All the Sonys I've found have only a single DVI input with one or two D-sub (VGA) inputs.
#6
"HockeyTownUSA" wrote in message:
> [snip]

Here's what the "sparkles" look like. Image taken from the nzone.com website: http://home.comcast.net/~fighterpilo...rkle-nzone.jpg
#7
> Here's what the "sparkles" look like. Image taken from nzone.com website: http://home.comcast.net/~fighterpilo...rkle-nzone.jpg

Forgot to say: look at the "chrome" around the nvidia logo.
#8
On 5/6/2005 6:59 PM HockeyTownUSA brightened our day with:
> Forgot to say look at the "chrome" around the nvidia logo.

What happens when you plug them directly into the monitor, bypassing the KVM switch? Does the problem go away? I understand your preference for DVI, but if it were me I'd just plug the 4200 into the VGA input on the monitor. I'd think that even with the slight signal loss VGA causes, it would be preferable to the sparkles. Oh, and it looks like I just have a DVI adapter plugged into a VGA slot on the back: one DVI input, two VGA. I was just looking at the two spare cables sitting on the floor and had forgotten how I'd connected them.
--
Steve ¤»Inglo«¤ www.inglostadt.com
#9
"Inglo" ioo@??.¿¿¿ wrote in message:
> What happens when you plug them directly into the monitor, bypassing the KVM switch, does it go away? [snip]

Well, I'm just returning this KVM and reverting to my trusty DVI/PS2 KVM. The picture looks great and it works great. I found another keyboard I like that is PS/2, and my Logitech MX518 works fine with a PS/2 adapter.