#1
Are two cards better than one?
After researching a dual-monitor (2 LCDs and a TV) setup, I have a question. My setup includes a P4 2.2 with an unknown-brand AGP GeForce4 MX440 (a Medion computer with a MicroStar motherboard). The existing video card will be used for my secondary monitor and the television. I want to add a second (PCI) video card that will connect to my Dell 19" LCD, and I'm not sure which card to buy. Should I get another MX440? Will using the same driver make the system more stable? The Dell monitor has both VGA and DVI inputs. Should I try to find a card with DVI, or is the image difference negligible? This will be mostly for business stuff and Paint Shop Pro - no games. Sure would appreciate any recommendations for the second video card. Thanks.
#2
Oops! Premature Posting!
Let me try again: After researching a dual-monitor (2 LCDs and a TV) setup, I've settled on two options. Either keep my existing unknown-brand GeForce4 MX440 in the AGP slot, use it for the secondary monitor and the TV, and get a new card with DVI for the primary monitor - or scrap the existing card and get a single new card, probably a Gainward Ultra/650 GeForce4 Ti4200 with 128MB. The question: in terms of performance, and most importantly stability, is it preferable to have one card handling two displays, or is it better to have a separate card for each display? The costs are about the same. Thanks.
#3
Well, I would say: the less you have, the less there is to go wrong.
-- Les Ross
Certified by a Professional
#4
I recommend the GF4 Ti4200 single card. I think 3D acceleration gets disabled when using two cards, but I could be wrong. In any case, one of the two cards would be running on PCI, which degrades performance a little. A single card with several outputs is also much less hassle: it draws less power, generates less heat, etc. You'll have a hard time finding a single-output DVI card anyway (most come with VGA+DVI or VGA+SVHS+DVI), so you'll probably end up with a multi-monitor card anyhow! And since you want to use the TV and VGA on one card and DVI on another... I think you'll end up in a configuration nightmare setting that up. /M
#5
I am running an Asus V8440 (GeForce4 Ti4400). It has outputs for two
monitors and is fine when using windowed apps, but when two monitors are installed it doesn't like full-screen apps like games. I got lock-ups, BSODs, etc. I ran two monitors for a while, but didn't use the potential enough to put up with the problems. Conrad
#6
I hate to say this, esp here...
but so far my best experience with dual-head setups has been with Matrox cards. I have used the G400 DualHeads for years, and when I got mine they came with a groovy cable for S-Video out to the TV on one of the connectors (2 D-Subs). Great drivers, nice features. Every time I look at the new cards and their prices, I just go back to my trusty G400. BUT... right now I am running my GF4 MX420 AND an old MGA PCI card at the same time to achieve a dual-head setup. So far it's good. I game with the GeForce, and use the other display for IM, spreadsheets, and remote desktop sessions to servers... NuTs