A computer components & hardware forum. HardwareBanter


Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit



 
 
  #1  
Old February 21st 18, 06:04 AM posted to alt.comp.hardware
t
external usenet poster
 
Posts: 77
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

The current graphic card is AMD Firepro 4800
https://www.amd.com/Documents/ati-fi...-datasheet.pdf

It was working OK until recently, but now we are seeing many lines on the
desktop, and a lot of things look like redacted text.

I updated the graphic card driver to the latest one from
https://support.amd.com/en-us/kb-art...s-drivers.aspx
which fixed the issue for a day, then the lines started appearing again.

It had a couple of Blue Screen of Death crashes. I looked at the crash
dump files and the faulting module was atikmdag.sys. I updated the driver
for the AMD FirePro 4800, but the issue persists.

The user claims no new hardware or software was installed in the last few
months. The operating system is Windows 7 64-bit, and the graphics card has
been driving two Dell P2213 monitors and a NEC MultiSync V463 for the past
2-3 years. The operating system has the required Windows patches, and the
workstation runs 24x7. It is used as the front end to view
heating/cooling devices in nearby buildings. It has 8GB RAM, and Resource
Monitor did not show any unusual load.

I ran an anti-virus scan but did not find any malware. I checked for
device conflicts in Device Manager but did not find any.

1. What could be causing the issues?

2. Can upgrading the graphics card resolve it? If so, would an Nvidia Quadro
K1200 (low power, low profile), an EVGA GeForce GTX 1070 SC GAMING ACX 3.0
8GB GDDR5, or a GeForce GTX 1050 Ti 4GB GDDR5 work for such a situation?


Any suggestions would be helpful.
  #2  
Old February 21st 18, 07:08 AM posted to alt.comp.hardware
Paul[_28_]
external usenet poster
 
Posts: 1,467
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

t wrote:
[snip]


Is the fan still spinning on the video card ?

Video card fans don't typically have tacho output, so there
is no means for hardware to monitor them that way.

Similarly, an overheated video card has no means to turn off the
computer. It's defenseless basically.

It's up to the user to inspect for blocked vents, or verify that
the fan actually spins. If you carry a telescoping inspection mirror, you
might be able to review the fan condition that way, with the side off
the PC.

You can also use utilities like GPU-Z or SpeedFan to get a temperature
reading off the video card, and see whether it is overheating even when
not switched into 3D mode (the clock rate goes up when 3D is called for).
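For what it's worth, the check can also be scripted if the box ends up with one of the NVidia cards mentioned in this thread: `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader` prints one core temperature per GPU. A minimal sketch; the helper names and the 90 C alarm threshold here are my own, not from any vendor tool:

```python
import subprocess

def parse_temps(csv_text):
    """Parse nvidia-smi 'temperature.gpu' CSV output: one integer per line."""
    return [int(line) for line in csv_text.splitlines() if line.strip()]

def gpu_temps():
    """Query the core temperature (deg C) of every NVidia GPU in the box."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True)
    return parse_temps(out)

def overheating(temps, limit_c=90):
    """Flag any GPU at or above an assumed (not vendor-specified) limit."""
    return [t for t in temps if t >= limit_c]
```

Run it from Task Scheduler every few minutes and you have a crude watchdog for a 24x7 box like this one.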

In terms of monitor support, you usually get lassoed into overpowered
cards, by the need for many outputs on the faceplate. For example, a
500 dollar card might have six outputs on the faceplate (two DVI, four
DisplayPort). Lower end cards tend to have a less useful mix of output
ports.

At the current time, new video cards have lost all their VGA capability.
The DVI-I connector has been changed to DVI-D, so you cannot get VGA
that way. You can use an "active" powered DisplayPort to VGA adapter as
a solution. But this adds to the expense of replacing the video card.

A 1030 would probably have sufficient graphics horsepower to drive
the three screens. But you could well be tricked into buying a more
expensive card, just to get a nice selection of ports, plus the
panorama mode to run with three monitors (Eyefinity or whatever
NVidia calls theirs).

*******

https://www.phoronix.com/scan.php?pa..._v7800&num=1

The ATI FirePro V4800 is also capable of driving up to
three independent displays while its core is based upon
the Redwood XT.

The Redwood XT is the GPU found within the ATI Radeon HD 5670
graphics processor.

With the FirePro V4800 there are 400 stream processors, 57.6GB/s
of memory bandwidth, power consumption of less than 75 Watts,
1GB of GDDR5 memory clocked at 900MHz, and the Redwood XT core
is clocked at 775MHz.
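Those quoted numbers are self-consistent, by the way. GDDR5 moves four bits per pin per memory-clock cycle, so 57.6GB/s at 900MHz implies a 128-bit bus. The bus width here is my inference, not something stated above; the datasheet would confirm it:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak GDDR5 bandwidth in GB/s: 4 transfers per pin per clock."""
    transfers_per_sec = mem_clock_mhz * 1e6 * 4
    return transfers_per_sec * bus_width_bits / 8 / 1e9

print(gddr5_bandwidth_gbs(900, 128))  # 57.6, matching the quoted figure
```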

The capabilities don't have to be the same, as the driver can restrict
anything they want it to restrict. An HD5670 won't necessarily work
exactly the same, but would certainly be cheaper. The connector choices
on these are defined by market segment more than anything. And then
you have to pay more (and get more Stream Processors when perhaps
you didn't need them).

https://www.amazon.com/POWERCOLOR-AX...eywords=HD5670

Doesn't list Eyefinity as supported...

https://www.comx-computers.co.za/AX5...y-p-108092.php

With modern enough cards, they don't really have to run hot when not
in 3D mode. So even if a card has a PCIE aux power input, it might
not actually be using all that electricity on a continuous basis.
Back in the 8800GTX era, the "idle" mode ran at 50% of "full power mode"
and the cards really wasted energy. Now, they're better than that.
Some cards could drop to 3W at idle. Some of the tech used, may not
allow hitting those targets on all cards, but at least they no longer
drop to just 50%, and should draw less than 50% at idle.

The next issue will be, AMD may stop offering 32-bit drivers.
So if you're buying a brand new card, that's something
else to watch for. And maybe no Win7 drivers ? Buying video cards
now is getting really dangerous. The customers are designed to
get a screwing now. And you have to contend with no stock
at the computer store. They *will* have stock of a $600 card :-)
That's what I discovered in my most recent scan of my
computer store here. Lots of missing SKUs. And then one
card was "stock 10+", which means they have a decent amount
of cards. Too bad the cards are the $600 ones. And these
aren't VEGA cards either, they're ~$250 class cards for ~$600.

And they do have 1030 cards - but too bad the connector mix on
the front isn't all that good. I wish they'd just drop the
pretense and put three DisplayPort on it and be done with it.
Then the user can go shopping for a pile of adapters...

Paul
  #3  
Old February 24th 18, 06:20 AM posted to alt.comp.hardware
t
external usenet poster
 
Posts: 77
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

Thanks Paul,





Is the fan still spinning on the video card ?


It was. I replaced the GPU with another borrowed video card temporarily.


https://www.amazon.com/POWERCOLOR-AX...eywords=HD5670

Thanks, I will look into it.



[snip]



Thanks. What other lower-priced cards would meet our needs of driving two
22-inch Dell monitors and a 46-inch NEC MultiSync V463?

Would the NVIDIA GeForce GTX 1050 at
https://www.bestbuy.com/site/pny-nvi...lack/5711723.p
($220) be OK for our needs?

[snip]


I agree, the prices have increased a lot recently.

[snip]


As usual, your in-depth guidance and advice is appreciated.
  #4  
Old February 24th 18, 08:41 AM posted to alt.comp.hardware
Paul[_28_]
external usenet poster
 
Posts: 1,467
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

t wrote:
[snip]
Thanks, what other lower priced cards would meet our needs of using two
22 inch Dell monitors and a 46 inch NEC MultiSync V463?

Would
NVIDIA GeForce GTX 1050 at
https://www.bestbuy.com/site/pny-nvi...lack/5711723.p
220 be OK for our needs?

The next issue will be, AMD may switch to not offering x32 drivers
any more. So if you're buying a brand new card, that's something
else to watch for. And maybe no Win7 drivers ? Buying video cards
now is getting really dangerous. The customers are designed to
get a screwing now. And you have to contend with no stock
at the computer store. They *will* have stock of a $600 card :-)
That's what I discovered in my most recent scan of my
computer store here. Lots of missing SKUs. And then one
card was "stock 10+", which means they have a decent amount
of cards. Too bad the cards are the $600 ones. And these
aren't VEGA cards either, they're ~$250 class cards for ~$600.


I agree, the prices have increased a lot recently.

And they do have 1030 cards - but too bad the connector mix on
the front isn't all that good. I wish they'd just drop the
pretense and put three DisplayPort on it and be done with it.
Then the user can go shopping for a pile of adapters...

Paul


As usual, your in depth guidance and advice is appreciated.


That card has three different connectors on it. Could be
HDMI, DisplayPort, and some flavor of DVI. Do you think
that'll cover it ?

https://www.pny.com/ProductImages//8...TX-1050-fr.png

The V463 is HD (1920x1080) so isn't going to be a problem
resolution-wise for any of those connectors.

And based on size, a 22" monitor probably isn't a monster
either.

*******

Normally, video cards are dual head. There are two logical
display channels in the card, they feed a crossbar that could
have five connectors on it, and 2-of-5 connectors work.

When Eyefinity came along, it allowed a logical display channel
to feed a 1x3 matrix of displays. The displays would all have
the same native resolution. And the "panorama" would be laid
across the three monitors. The first company to do this
sort of thing, was Matrox, with their external solution for
using one connector to drive two or three monitors.

But something similar was done local to the crossbar block
inside the GPU. I think the AMD implementation allowed
up to six monitors. Implemented as two 1x3 arrays. I can
only guess that this uses up the two logical display
channels. I'm not sure whether NVidia goes head to
head with them, and also supports up to two 1x3 arrays.

I don't really understand the significance of the two logical
display channels. It's existed for a dog's age. Back when
a video card with two connectors was invented, one connector
would do 1600x1200 and the other might only do 1024x768; the
two outputs weren't even made to match back then, so they were
not the same. Back then, making "DACs" was "hard", and for some
reason they didn't like to make two identical ones. So somewhere
around that time, some bright individual decided to drive the two
of them with logical display channels.

Sometimes, you can see latency issues between the two logical
display channels. So you don't really want to use both of
them, if you want a "seamless" display. For example you
could have two CRTs (zero thru-delay) and see one screen
update out of phase with the other screen.

And that's where the Eyefinity concept comes in. The
monitors would be perfectly in sync, because the data
being fed to them is a 5760x1080 block in your case.
And the crossbar makes three 1920x1080 in-phase out of it.
The monitors can still have different "thru-delay". Some
monitors, it takes four frame times for a pixel on the
input connector, to become a pixel on the panel. And
some panels can be faster in thru-delay than others.
This can result in a slight disparity that only a gamer
would notice.
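The crossbar's job in that example is just address arithmetic. A toy sketch of how a 5760x1080 panorama maps onto three identical panels; an illustration only, nothing like real driver code:

```python
def split_panorama(total_w, total_h, n_panels):
    """Return (x, y, w, h) viewports for a 1xN array of identical panels."""
    panel_w = total_w // n_panels
    return [(i * panel_w, 0, panel_w, total_h) for i in range(n_panels)]

# Three side-by-side 1920x1080 viewports, all cut from the same frame,
# which is why they can never update out of phase with each other.
for rect in split_panorama(5760, 1080, 3):
    print(rect)
```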

"AMD Eyefinity"
"NVidia Surround"

Check your monitors and make sure the connector mix
is going to work.

http://www.htgsd.com/information-tec...g-with-nvidia/

Paul
  #5  
Old February 24th 18, 09:02 AM posted to alt.comp.hardware
Paul[_28_]
external usenet poster
 
Posts: 1,467
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

Paul wrote:


"AMD Eyefinity"
"NVidia Surround"


Now, something I forgot to check, is whether the 1050
actually lists Surround as a feature (oops). It's not on the
PNY web page. Hmmm.

I suspect this page isn't up-to-date.

https://www.geforce.com/hardware/tec...supported-gpus

The "surround configuration tool" does not respond to a selection
of 1050 and single GPU.

https://www.geforce.com/hardware/tec...m-requirements

The tool does respond if you select a 1060. This is the info for a single 1060.

Orientation: Landscape

Accessory Display: Yes, no additional GPU (if using 3 or less displays)

Maximum Resolution: 11520x2160

Maximum Resolution (Bezel Correction): 10240x1600

Maximum number of displays: 2-4 in Surround,
1 Accessory Display when
using 3 or less displays in Surround

The card comes in 3GB and 6GB versions. This is possibly
the one my computer store has "10+" of. The expensive one.

https://www.pny.com/geforce-gtx-1060-3gb

I can't really tell you whether three independent displays would work
on the 1050. It sounds like a violation of the logical display channels
thing. But good documentation on what's behind the crossbar today,
is pretty hard to find. The last AMD picture I have is from the
HD1000 era. I don't know if NVidia even makes an architecture
picture like that for us.

You could run two 1050 cards in non-SLI, but I can't guarantee
the three monitors will update on the same frame boundary.

As for "where did the AMD Vega cards go", Apple came out with
an AIO machine recently, and I think that had a Vega in it.
And they may have got the entire allocation of GPUs or something.
While it's possible coin miners got them, they're "vastly absent"
from the market. I don't think you can get a Frontier one either
(16GB).

Paul
  #6  
Old February 25th 18, 09:21 PM posted to alt.comp.hardware
t
external usenet poster
 
Posts: 77
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

Thanks Paul.

On 2/24/2018 2:41 AM, Paul wrote:

[snip]

Your advice and guidance is highly appreciated.

  #7  
Old February 25th 18, 09:28 PM posted to alt.comp.hardware
t
external usenet poster
 
Posts: 77
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

Thanks Paul,

On 2/24/2018 3:02 AM, Paul wrote:
[snip]
You could run two 1050 cards in non-SLI, but I can't guarantee
the three monitors will update on the same frame boundary.


As long as it works reasonably, it should be fine. It is for monitoring
temperatures of boilers and chillers.

As for "where did the AMD Vega cards go", Apple came out with
an AIO machine recently, and I think that had a Vega in it.
And they may have got the entire allocation of GPUs or something.
While it's possible coin miners got them, they're "vastly absent"
from the market. I don't think you can get a Frontier one either
(16GB).

Paul


Your advice and support is appreciated. You are a GREAT asset to this
newsgroup!
  #8  
Old February 25th 18, 11:19 PM posted to alt.comp.hardware
Paul[_28_]
external usenet poster
 
Posts: 1,467
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

t wrote:


Your advice and guidance is highly appreciated.


So what have you decided to do ?

The 1060 comes closest to providing a seamless
solution, using only, say, two slots of card width
for its heatsink. And the user should never be aware
they're in panorama mode; windows slide from one monitor
to the next seamlessly.

If I decided to not use NVidia Surround, I might
try a couple 1030 cards. But they're still two-slot
due to their heatsinks - I haven't seen thin versions
of those cards. They're the new "low end", and supposed
to be a bit faster than chipset graphics. One of the
consequences of trying to stay away from the performance
level of chipset graphics, is the cards have to be
a bit more powerful than in the past.

There might not be room in the computer for two cards
like that.

If I was doing what you're doing, "monitoring boilers
on my home computer" on three monitors, I might use
a 1030 for two monitors, and one of these for the
third monitor.

This one is VGA.

https://www.startech.com/AV/USB-Vide...ter~USB32VGAES

"This USB video adapter uses a Trigger family chipset.
If you’re connecting this device to a computer along
with additional USB video adapters or docking stations,
please avoid using devices with a DisplayLink or Fresco
family chipset."

This one is HDMI.

https://www.startech.com/AV/USB-Vide...pter~USB32HDES

"This USB video adapter uses a Trigger family chipset.
If you’re connecting this device to a computer along
with additional USB video adapters or docking stations,
please avoid using devices with a DisplayLink or Fresco
family chipset."

There's even one that supports 4K monitors (the horror!).
And it really doesn't cost any extra. Read the reviews
to determine if this level of aggravation is worth it.

"VANTEC NBV-400HU3 USB 3.0 to 4K HDMI Display Adapter DisplayLink Certified "

https://www.newegg.com/Product/Produ...82E16812232062

http://www.vantecusa.com/products_de...name=USB#tab-2

Back when the USB3 generation came out, the word was
"those almost work without using compression". Back
in the USB2 era of DisplayLink, the pixels were severely
compressed. (Even a slide show was a slide show.) Displaying
boiler status would work OK. Trying to watch Netflix on
that screen, not so much. The USB3 version is getting
close to being able to run Netflix without an issue.
If you don't have "real" USB3 ports, then YMMV.

If you're adding USB3 to an existing computer, buy a
USB3.1 Rev2 card (with 10Gbit/sec capability). The idea
is, you don't run it at 10Gbit/sec! The reason for doing it,
is the chip for the 10Gbit/sec version has two PCI Express
lanes (x2 wiring) and you may notice the card has an x4
connector on it. What it is supposed to buy you, is full rate
regular USB3 (i.e. no compromise 5Gbit/sec operation).
One of these days when I get a chance, I plan on locating
one and testing this. As I have a USB3 to SATA adapter
that isn't running full speed (only runs half speed).
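The rough line-rate arithmetic behind that advice, assuming the x2 link runs at PCIe 3.0 rates (my assumption; a PCIe 2.0 card would halve the PCIe figure but still clear 500MB/s):

```python
def usb3_gen1_mbs():
    """USB 3.0 (Gen1): 5 Gbit/s line rate, 8b/10b encoding overhead."""
    return 5e9 * 8 / 10 / 8 / 1e6      # payload line rate in MB/s

def pcie3_mbs(lanes):
    """PCIe 3.0: 8 GT/s per lane, 128b/130b encoding overhead."""
    return 8e9 * 128 / 130 / 8 / 1e6 * lanes

print(usb3_gen1_mbs())  # 500.0
print(pcie3_mbs(2))     # ~1969 MB/s: an x2 slot comfortably feeds full-rate USB3
```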

Paul
  #9  
Old March 11th 18, 05:10 AM posted to alt.comp.hardware
t
external usenet poster
 
Posts: 77
Default Upgrading graphic card for Dell Optiplex 7010 mini tower running Windows 7 64 bit

On 2/25/2018 5:19 PM, Paul wrote:
t wrote:


Your advice and guidance is highly appreciated.


So what have you decided to do ?


Get the GTX 1050 at
https://www.bestbuy.com/site/pny-nvi...lack/5711723.p
as it was cheaper than the 1060.

Thanks for all your advice!

[snip]


 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.