January 23rd 2019, 07:19 PM, posted to alt.comp.hardware.overclocking.amd
Paul
Subject: What B450 MB for Ryzen 2200G + 2x4GB Kingston HyperX Predator?

MaxTheFast wrote:
I've just bought the MSI B450-A PRO because its price dropped back within my budget. I just hope it will be a good purchase for my configuration and that it will stay reliable for as long as possible.
This is a hi-res pic of the MB's upper side (about 10MB):
https://images.anandtech.com/doci/13...0-a_pro-2d.png

I've found the following pages on this MSI's phase arrangement:
https://www.hardwareluxx.de/communit...e-1155146.html
https://nl.hardware.info/reviews/854...et-hoe-zit-het
where we can read this:
PWM-controller: Richtek RT8894A (4+2)
echte Phasen (real phases): 4
highside MOSFET: 2x 4C029N
lowside MOSFET: 2x 4C024N
The 2nd linked page shows a comparison table between this MSI, the ASRock Fatal1ty B450 Gaming K4 and the Gigabyte B450 Aorus Pro, but it's all Greek to me, so I just hope I've got a board on par with the others even if it's not the best one among those I've pointed to.
What can you say about those phase figures?

Yeah, I knew about and "studied" the VGA issue with certain AMD CPU + motherboard configurations. Here is some useful information:
https://forum-en.msi.com/index.php?topic=312357.0
especially:
"My understanding is that on the newer 400 series chipsets, you should be able to use either the HDMI, VGA, or DVI-D port(s).
There's an issue on the older 300 series chipsets when using an Athlon 200GE whereby you may not be able to use the VGA due to how the pinouts have changed on the CPU (AMD's fault there....) and how it's supported.
So....with that being said, the 400 series chipsets shouldn't be a problem support wise. As long as the BIOS is up to date, it should work."
This is a real problem for me because I'd like to build this new PC around an old 1440x900 @60Hz Acer monitor with only a VGA input, and I have NO monitor with an HDMI port! As far as I can understand, I'll only manage the VGA connection between the MSI board and the Acer over a standard VGA cable if the MSI comes with a newer BIOS out of the box. If not, I'll have to use an active HDMI-to-VGA converter as you suggested. I think an HDMI-to-VGA converter would be the best choice for this situation, because I couldn't find any reports of problems with the HDMI output around the web, while there are lots of reported troubles with the VGA, DP and DVI-D outputs on these configurations. Because of that I've already bought this converter:
https://www.amazon.com/dp/B01GJO6HPW
So my strategy is: if VGA doesn't work out of the box, I'll use the HDMI port plus the converter.
There's only one last question I couldn't find an answer to: what kind of HDMI signal will come out of the MSI out of the box? My concern is that the MSI will drive the HDMI output at its maximum values:
"HDMI 1.4 port, supports a maximum resolution of 4096x2160 @30Hz, 2560x1600 @60Hz"
With the converter in the path, I'm afraid it will translate that maximum HDMI output into VGA timings too high for the Acer monitor to display, and I'll get a black screen. In that case I'll be running for mayor of "ass ville" (as you said) and I'll have to sell the Acer and buy a used HDMI monitor, even though my budget is nil for now.
Is what I said correct, or did I make mistakes in my "studies" and my plan? Can you suggest any other solutions?


The RT8894A is more or less "designed for AMD". Richtek
is traditionally a "low frequency SMPS" maker, not that this
matters. My AthlonXP board had one. Richtek could do around
30-35W per phase back then (not that modern and older parts
are directly comparable, but at least they've demonstrated the
capability in the past). When Richtek did a two-phase design,
others were doing three or four phases at the same point in time.

https://www.richtek.com/Products/Vco...specid=RT8894A
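
As a sanity check on those per-phase numbers, here's a rough
back-of-the-envelope figure. It assumes the four "real phases"
from the table above all feed Vcore, the 2200G's nominal 65W
package power, and a 90W short-term excursion that is purely an
illustrative guess, not a measured value:

    # Rough per-phase load estimate for a 4-phase Vcore section.
    # The 90 W "boost" number is an assumption for illustration only.
    tdp_w = 65.0      # nominal 2200G package power (65 W TDP)
    boost_w = 90.0    # assumed worst-case excursion, not a datasheet value
    phases = 4        # "real phases" per the hardwareluxx table

    for label, watts in (("TDP", tdp_w), ("assumed boost", boost_w)):
        print(f"{label}: {watts / phases:.1f} W per phase")
    # TDP: 16.2 W per phase
    # assumed boost: 22.5 W per phase

Even the assumed worst case sits comfortably under the 30-35W per
phase figure mentioned above, so a 2200G shouldn't stress this VRM.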

Their diagram is a bit weird. The chip is supposed to be a 4+2,
yet of the four phases on the right hand side, three are direct
drive and the fourth uses an RT9624A. The purpose of those eight
pin "pre-drive" chips is to act as a buffer, separating the 3000pF
MOSFET gate capacitance from the main regulator chip. When you use
pre-drive chips like that, it makes the voltage regulator run cooler.
The pre-drive chip gets warm, but it's separate from the main chip.
You can cool regulator chips by using a thermal slug on the bottom
and soldering the chip bottom to the motherboard. It's better if all
the phases are buffered, but you can see from the design that the
chip architecture is meant to hit a price point, so it's a compromise
between lower running temperature and the cost of buffering up all
the phases. You can probably drive three phases of large MOSFETs
without boiling the main controller.

https://www.richtek.com/~/media/Rich...57744ctqaa.GIF
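
To put the 3000pF figure in perspective, the power burned just
charging and discharging one of those gates is roughly C*V^2*f.
A minimal sketch, assuming a 12V gate-drive rail and a 300kHz
per-phase switching frequency (neither value is taken from the
RT8894A datasheet):

    # Gate-drive loss per MOSFET: P = C * Vgs^2 * f_sw.
    # The gate rail and switching frequency are assumptions.
    c_gate = 3000e-12   # F, gate capacitance quoted above
    v_gs = 12.0         # V, assumed gate-drive rail
    f_sw = 300e3        # Hz, assumed per-phase switching frequency

    p_drive = c_gate * v_gs**2 * f_sw
    print(f"~{p_drive:.2f} W of gate-drive loss per MOSFET")   # ~0.13 W

That loss lands in whichever chip drives the gate, so pushing it out
into a separate pre-driver keeps it out of the main controller package.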

It's better if the MOSFETs have a good-sized heatsink. The chip has
sensing, and can probably sense phasing properly. If I could find
the PDF datasheet, I could check whether the stuff on the left is
intended for thermistor input (as then the circuit can monitor
operating temperatures). On a P4 board I own here, they had a
regulator with temperature compensation capability and they
"skipped" using the thermistor, and on that design that adds around
a 50mV error during CPU load step changes (which might affect
overclocking attempts, and that was on an enthusiast-class board
too). It's really better if the "optional gubbins" are installed
on stuff like that.

*******

As far as resolution setting goes, even if an adapter is in the path,
the EDID serial clock and data are passed through to the motherboard.
The motherboard can "read" the EDID. If the EDID says "I'm a
1440x900 monitor", whether it's a CRT or an LCD, the board will
set a resolution according to that "monitor declaration". The
max resolution a monitor should be asked to show is its "native
resolution". Even if the HDMI port is capable of 4096x2160, once
the EDID is read, the HDMI mode line will be set to 1440x900 as
you would expect.
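
To make the "read" concrete, here's a minimal sketch of pulling the
preferred mode out of an EDID block. The sysfs path is a Linux
example and the connector name is just a placeholder; the point is
that the first detailed timing descriptor carries the native mode:

    # Minimal sketch: extract the preferred (native) mode from an EDID blob.
    def preferred_mode(edid):
        """Return (width, height) from the first detailed timing descriptor."""
        if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
            return None                   # missing or invalid EDID
        d = edid[54:72]                   # first detailed timing descriptor
        width = d[2] | ((d[4] & 0xF0) << 4)
        height = d[5] | ((d[7] & 0xF0) << 4)
        return width, height

    # Example: on Linux the raw EDID is exposed under /sys/class/drm/
    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
        print(preferred_mode(f.read()))   # e.g. (1440, 900) for the Acer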

Where you end up in trouble is with "projectors". We had one
at the office, a projector that takes a laptop's VGA output and
projects a picture onto a meeting room screen. On a lot of those,
there is *no* EDID. When no EDID is detected, it's an industry
tradition not to blow up any fixed-sync monitors, so the driver
picks values like 1024x768 or 1152x870 or so. It specifically
won't let an EDID-free setup run some huge mode by accident. This
feature is both a blessing and a curse of course... a blessing when
some poorly designed display device is not ruined, but a curse the
rest of the time.
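
That fallback behaviour, as logic (this reuses preferred_mode()
from the sketch above; the particular safe modes are just the
typical values mentioned here):

    # No EDID?  Fall back to a conservative fixed mode instead of the
    # output's maximum.
    SAFE_MODES = [(1024, 768), (1152, 870)]

    def choose_mode(edid):
        mode = preferred_mode(edid) if edid else None   # preferred_mode() from above
        return mode if mode else SAFE_MODES[0]          # never guess a huge mode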

It's possible to buy an "EDID faker" box for $50, which you
place between the laptop and the projector. And there are
various programming options via the box. Some boxes, you
can connect them to a regular monitor, and have the
"EDID faker" copy the resolution table in write mode.
Then for the rest of its life, the EDID faker box
runs in read-only mode, telling the world the "projector
runs at 1440x900". I haven't seen those boxes lately and
assume you can still buy them, but the market today
would be smaller. And with the deprecation of VGA as
a standard, a lot of the faker boxes would be from
the VGA era.

The only unanswered question is why the damn projectors were
made that way in the first place. Why avoid putting a $2 chip
in a $1000 projector, forcing a customer to spend $50 on a box
and power supply with a $2 chip inside? :-/ The engineers
who did that should be taken off to the crazy house.

I think you're in good shape on the VGA. As long as the
VGA isn't "outright dead", you now have a flow chart
of what to do about it. And that was my main concern:
that a VGA connector was put on the motherboard at all,
when in 2019 it really shouldn't still be on the back
of the system (forcing you to buy the HDMI-to-VGA
active adapter right away, instead of it being a
conditional purchase).

Paul