A computer components & hardware forum. HardwareBanter


Monitor question



 
 
  #1  
Old March 15th 21, 05:25 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

I have an Asus Strix 750 Ti GPU. It has 2GB of onboard video memory and
uses 8GB of shared system memory. According to its specifications, it
can apparently handle 3840x2160 resolution.

I have been happily running a monitor (Dell UltraSharp 2407WFP) at
1920x1200 native resolution. It's got about 12 years on it. I don't
really do computing which pushes the video card harder than casual games.

If I were to upgrade to a 4K monitor (3840x2160 resolution), would I
likely be pleased or unsatisfied with the results? By rough math, I
think I would use about 4 times as much memory (but I don't know how to
see how much I am actually using now!). That's assuming that 4K uses
32-bit color too. Mainly I like "sharp crisp text". Somehow, after
reading some reviews, I ended up considering the Dell 2721Q monitor
(which is almost $500). It seems as you get bigger screens you need
finer resolution to get "sharp crisp text"! ; ) (duh!) Anyone
following GPUs in the news knows that this is a rather poor time to be
in the market for a GPU.

Any comments or suggestions based upon your experience are welcome!

Bill
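Bill's rough memory estimate can be sanity-checked in a couple of lines. This counts only one uncompressed framebuffer, assuming 32-bit color as the post does; a real card allocates far more than the bare framebuffer, so treat it as a lower bound:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Size of one uncompressed 32-bit framebuffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

old = framebuffer_mib(1920, 1200)   # ~8.8 MiB
new = framebuffer_mib(3840, 2160)   # ~31.6 MiB
print(f"{old:.1f} MiB -> {new:.1f} MiB ({new / old:.1f}x)")
```

The ratio works out to exactly 3.6x, so "about 4 times as much memory" is a fair estimate, and either way it is a tiny fraction of the card's 2GB.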
  #2  
Old March 15th 21, 05:37 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Bill wrote:

[snip]



P.S. I should add that I would intend to use DisplayPort (v 1.4)
connector instead of DVI.
  #3  
Old March 15th 21, 06:36 AM posted to alt.comp.hardware.pc-homebuilt
Paul[_28_]
external usenet poster
 
Posts: 1,467
Default Monitor question

Bill wrote:

[snip]


You would check which standards your existing video card supports,
and see whether the standard supports 60Hz operation at the
resolution of interest. The Wikipedia articles on HDMI and DisplayPort
have tables covering exactly this.

One possible issue is the version of HDCP. They don't figure it out
in the thread below, so I don't know what the issue is exactly. Will
the OS agree to run 4K without HDCP? The hardware likely allows it,
but the OS tunes for maximum DRM. Video cards have had "added evil"
to stop copying, and the OS only has to tap into those calls to sew
up the copy hole.

https://forums.tomshardware.com/thre...50-ti.2621525/

And no, I don't have a 4K monitor here. I've got two monitors on my
desk (run by two computers), and there isn't room for some huge
monitor. The second monitor is on my desk because the second
computer runs video conferences; it sits further away, so less of
its noise gets into my microphone. My lashup lets me video conference
without fan noise from the first computer being powered up.

I will be glad when, some day, this video-conference fetish has
ended. The last video conference was a flop, when I couldn't
log into the damn thing. I had to take a phone call instead
(which, as it happens, was all that was required anyway).

Paul
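Paul's advice to consult the HDMI/DisplayPort tables comes down to comparing the data rate a mode needs against the link's effective bandwidth. A rough sketch of that comparison (the link rates are the commonly quoted effective figures after 8b/10b encoding; the ~20% blanking overhead is an assumption, so treat the numbers as ballpark):

```python
# Rough bandwidth check: which links can carry 3840x2160 @ 60 Hz?
# Link rates are effective payload bandwidth after 8b/10b encoding.

def required_gbps(width, height, hz, bpp=24, blanking=1.2):
    """Uncompressed video data rate in Gbit/s, with ~20% blanking overhead."""
    return width * height * hz * bpp * blanking / 1e9

need = required_gbps(3840, 2160, 60)          # ~14.3 Gbit/s
links = {
    "HDMI 1.4 (~8.16 Gbit/s)": 8.16,
    "DisplayPort 1.2 HBR2 (17.28 Gbit/s)": 17.28,
    "DisplayPort 1.4 HBR3 (25.92 Gbit/s)": 25.92,
}
for name, cap in links.items():
    print(f"{name}: {'OK' if cap >= need else 'too slow'} for 4K60")
```

By these numbers, an HDMI 1.4-era port tops out around 4K30, while DisplayPort 1.2 or later has the headroom for 4K60, which is why the DisplayPort connector is the right choice here.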
  #4  
Old March 15th 21, 03:42 PM posted to alt.comp.hardware.pc-homebuilt
Larc[_3_]
external usenet poster
 
Posts: 383
Default Monitor question

On Mon, 15 Mar 2021 00:25:57 -0400, Bill wrote:

| [snip]

I had a 27" 4K monitor plugged into the display port on an EVGA GTX 750ti for a few
weeks while I was waiting for another GPU and 3840x2160 @60Hz worked perfectly. It's
not a great setup for gaming, but 4K streaming is outstanding.

Larc
  #5  
Old March 15th 21, 09:22 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Paul wrote:

[snip]


And no, I don't have a 4K monitor here. I've got two monitors on my
desk (run by two computers), and there isn't room for some huge
monitor.


In my "office" I have two desks: one with a computer on it, and a
kitchen table I have been using as a desk for 36 years now--time flies!
Your post is helpful! It will at the very minimum motivate me to learn
what HDCP is---I've run into that before! : ) Paying a premium price
for a monitor to run at a non-premium resolution doesn't make sense, so
I will get this sorted out.

I strive to build quiet computers. I start with a quiet power supply and
include a quiet GPU (with "semi-passive cooling"): its fan only kicks in
when it needs to. There is a bit more to the complete strategy than
this, but it starts during the "design" stage. You know more about
computers than I do, so there is no sense in me rambling on... ; )
Everything is a compromise...





  #6  
Old March 15th 21, 09:27 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Larc wrote:

[snip]

Thank you Larc! That is encouraging.
  #7  
Old March 15th 21, 10:05 PM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Larc wrote:

[snip]

Larc, when you used DisplayPort, did you have to switch to getting the
audio via the monitor (aux?), or were you able to still get audio (5.1 in
my case) directly from the various audio connectors on the card (as
you did when you were using DVI)? To my thinking, it sounds like a lot
to ask to get 5.1 through an "aux" connection. Thanks!

Bill
  #8  
Old March 16th 21, 01:11 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Bill wrote:

[snip]


Here is an attempt to clarify my question: Can I use DisplayPort and
the output audio jacks on my GPU at the same time?

BTW, I found the Dell 4K model S2721QS to be more in line with my
needs and budget. It's supposed to be around $340, but it's not
currently in stock anywhere. In case anyone else is looking, I think
it's the "sweet spot" between price and features (if you don't require
USB jacks on your monitor).
  #9  
Old March 16th 21, 02:55 AM posted to alt.comp.hardware.pc-homebuilt
Bill[_41_]
external usenet poster
 
Posts: 24
Default Monitor question

Bill wrote:

Here is an attempt to clarify my question: Can I use DisplayPort and
the output audio jacks on my GPU at the same time?



It occurs to me now that the audio jacks are on the mainboard, so this
should be a no-brainer. It's curious how the GPU could even get the
audio--maybe in a different application of the DisplayPort (i.e. in a
different device)...
  #10  
Old March 16th 21, 03:48 AM posted to alt.comp.hardware.pc-homebuilt
Larc[_3_]
external usenet poster
 
Posts: 383
Default Monitor question

On Mon, 15 Mar 2021 17:05:55 -0400, Bill wrote:

| [snip]

I use the motherboard optical out to a receiver. Actually, I've never used a GPU
audio function, even with an HDMI connection, which I prefer to DVI when I can't use
DisplayPort.

Larc
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.