Display output difference compared to Mac Mini M2.

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
Hi All,

I have made a strange observation, and it's been bothering me a lot.

I have a Mac Mini M2 and a hackintosh with an AMD 6800 XT GPU.

I connected both to my TV for testing.

The Mac Mini M2 produces a very crisp output on the TV, and it looks fantastic. I connected the M2 with a USB-C to HDMI adapter.

I used the same adapter with my AMD 6800 XT GPU: I connected the DP output from the GPU to my Titan Ridge card, took the USB-C output from there, and used the same USB-C to HDMI adapter to connect to my TV.

The output from my 6800 XT is not as sharp and crisp as the M2's. Why?

I even tried a DP 1.4 to HDMI 2.1 adapter cable; same output. I also tried an HDMI-to-HDMI cable.

I made sure to set the output to 4K 120 Hz SDR RGB 10-bit in both cases, and the TV confirmed this.

No matter what I do, I cannot get the same sharp, crisp, smooth output from my hackintosh's AMD 6800 XT that I get from my M2 Mini.

There is clearly a difference in the output, keeping all cables/adapters and output parameters the same in both cases. The only difference is the GPU.

Anyone else observed this?

To be honest, I'm quite disappointed in my 6800 XT for giving such suboptimal output compared to the Mac Mini M2.

I do not have any other monitor to test with.

Please share your thoughts on this.

Thanks.
 

Edhawk

Guru
Joined
May 2, 2020
Messages
2,376
Simple. You are comparing Apples and Oranges.

The RX 6800 XT would only officially be used in a MacPro7,1 (2019), as that is the only current Mac that can take a PCIe dGPU. You are also comparing completely different CPU/GPU combinations.

The M2 Mini is a completely different animal. The drivers for the M2 GPU have been customised and adapted to work at this higher resolution/refresh rate to provide this 'crisper' image.

Does the TV (make & model?) recognise the M2 Mini and adapt the screen brightness, contrast, colour, etc. to reflect the fact that an Apple Silicon system is connected? As it runs at a 120 Hz refresh rate, I assume it is a high-end display/TV. Are Apple or the TV/display manufacturer likely to have added extra kexts/dexts to the OS (assuming Sonoma) so a high-end TV/display of this nature works as well as it can?

The AMD hackintosh you have built isn't in the same league. The RX 6800 XT hasn't been customised and tweaked to show/enable this 'crisper' image. The AMD drivers in macOS are not the best; they have limited customisation and are probably 3-4 years old, compared to the more recent M2 ARM/Apple Silicon GPU drivers.
 

Edhawk

Guru
Joined
May 2, 2020
Messages
2,376
I read that 'the code in the AMD kexts simply cannot handle OLED displays'.

Use an app like Lunar to see if this makes the TV/Display show in a better light.

 

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
Yes, I believe the AMD drivers are not optimized like the M2 GPU drivers. This is evident from the fact that the 6800 XT on Windows gives a better, crisper image.

I guess the 6000 series will be the last generation to be supported, and even that support is poor.

I wish Apple would support both Nvidia and AMD GPUs, but that's not likely to happen.

My TV is the Sony A95L QD-OLED.
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
Yes, I believe the AMD drivers are not optimized like the M2 GPU drivers. This is evident from the fact that the 6800 XT on Windows gives a better, crisper image.

I guess the 6000 series will be the last generation to be supported, and even that support is poor.

I wish Apple would support both Nvidia and AMD GPUs, but that's not likely to happen.

My TV is the Sony A95L QD-OLED.
I have a widescreen Samsung OLED monitor being delivered Saturday and will connect it to three different machines: a Mac Mini M2 connected by a Thunderbolt to DP cable, plus an AMD and an Intel-powered machine using RX 6950 XT graphics cards via DisplayPort. I'm curious to see if there will be a noticeable difference in display quality.
 

atanvarno

Donator
AMD OS X Member
Joined
May 2, 2020
Messages
228
The output from my 6800 XT is not as sharp and crisp as the M2's. Why?

I even tried a DP 1.4 to HDMI 2.1 adapter cable; same output. I also tried an HDMI-to-HDMI cable.

I made sure to set the output to 4K 120 Hz SDR RGB 10-bit in both cases, and the TV confirmed this.

No matter what I do, I cannot get the same sharp, crisp, smooth output from my hackintosh's AMD 6800 XT that I get from my M2 Mini.
Is the macOS the same on both machines? Sonoma makes a huge difference.

You mentioned USB-C to HDMI and also DP 1.4, so it's not clear to me which of those gave you 4K 120 Hz SDR. If this hackintosh is on Ventura, I doubt either of them truly gave you 4K 120 Hz SDR.

DP 1.4 4K SDR only goes up to 95 Hz on my 6900 XT in Ventura. It's only in Sonoma that Apple finally (after 3+ years) fixed the drivers for AMD cards; with DSC (Display Stream Compression) over DisplayPort, the card can deliver far more bandwidth, including the 4K 144 Hz HDR that I have on my monitor. The Sonoma drivers also enabled VRR (Variable Refresh Rate), which further enhances the picture quality, especially with lots of static text.
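To put rough numbers on why DSC matters here, this is a back-of-the-envelope sketch (assuming a ~10% blanking overhead, not exact CVT timings) showing that uncompressed 4K 120 Hz 10-bit RGB exceeds the DP 1.4 payload bandwidth:

```python
# Rough bandwidth estimate: why 4K 120 Hz 10-bit RGB needs DSC over DP 1.4.
# Blanking overhead is approximated at ~10%; exact CVT-RB timings differ.

active_pixels = 3840 * 2160
refresh_hz = 120
bits_per_pixel = 3 * 10          # RGB, 10 bits per channel
blanking_overhead = 1.10         # assumption: ~10% extra for blanking intervals

required_gbps = active_pixels * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b line-code overhead
dp14_payload_gbps = 4 * 8.1 * (8 / 10)

print(f"required ~{required_gbps:.1f} Gbit/s, DP 1.4 payload ~{dp14_payload_gbps:.2f} Gbit/s")
print("fits uncompressed:", required_gbps <= dp14_payload_gbps)
```

The required rate comes out around 33 Gbit/s against roughly 26 Gbit/s of usable DP 1.4 bandwidth, so without DSC the link has to drop the refresh rate or bit depth.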

Meanwhile, M-series machines have had VRR and proper bandwidth support for years, which means years of fine-tuning for various displays.
 

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
Is the macOS the same on both machines?
No.

On my hackintosh I have Ventura. I used a USB-C to HDMI 2.1 adapter. The USB-C output was taken from my Titan Ridge card; my 6800 XT is connected to the Titan Ridge card via a DP 1.4 to Mini DP cable. I never knew we could do this.

When I take the USB-C output from my Titan Ridge card in Ventura, I get 4K 120 Hz SDR 10-bit.

But even if I use just 60 Hz, it still makes no difference.

I'm afraid upgrading to Sonoma will break something on my hackintosh.

I also tested Windows and Linux with this USB-C output, and the picture quality is crisp and smooth on both. Only on macOS do things get a little blurry.

Oh, I forgot to mention: you need to update the firmware on the Cable Matters USB-C to HDMI 2.1 adapters to get 4K 120 Hz 10-bit RGB. Cable Matters released unofficial firmware for their DP 1.4 to HDMI 2.1 and USB-C to HDMI 2.1 adapters to enable 4K 120 Hz. There is a huge thread on MacRumors on how to get this done.

So, be it Ventura or Sonoma, you can now get 4K 120 Hz 10-bit RGB with any recent (VM700 chipset) DP 1.4/USB-C to HDMI 2.1 adapter.
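For what it's worth, a similar back-of-the-envelope sketch (same ~10% blanking assumption, and assuming the adapter outputs full-rate FRL at 4 lanes x 12 Gbit/s with 16b/18b coding) shows the HDMI 2.1 side of such an adapter has enough payload bandwidth to carry that signal uncompressed:

```python
# HDMI 2.1 FRL payload vs. 4K 120 Hz 10-bit RGB (rough estimate).
# Assumes full FRL: 4 lanes x 12 Gbit/s with 16b/18b line coding,
# and ~10% blanking overhead (approximation, not exact timings).

required_gbps = 3840 * 2160 * 120 * (3 * 10) * 1.10 / 1e9
hdmi21_payload_gbps = 4 * 12 * (16 / 18)   # usable after line coding

print(f"required ~{required_gbps:.1f} Gbit/s, HDMI 2.1 FRL payload ~{hdmi21_payload_gbps:.1f} Gbit/s")
print("fits uncompressed:", required_gbps <= hdmi21_payload_gbps)
```

So the bottleneck is on the DP 1.4 input side of the adapter, not the HDMI 2.1 output side.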
 

atanvarno

Donator
AMD OS X Member
Joined
May 2, 2020
Messages
228
No.

On my hackintosh I have Ventura. I used a USB-C to HDMI 2.1 adapter. The USB-C output was taken from my Titan Ridge card; my 6800 XT is connected to the Titan Ridge card via a DP 1.4 to Mini DP cable. I never knew we could do this.
When I take the USB-C output from my Titan Ridge card in Ventura, I get 4K 120 Hz SDR 10-bit.
Oh, I forgot to mention: you need to update the firmware on the Cable Matters USB-C to HDMI 2.1 adapters to get 4K 120 Hz 10-bit RGB. Cable Matters released unofficial firmware for their DP 1.4 to HDMI 2.1 and USB-C to HDMI 2.1 adapters to enable 4K 120 Hz. There is a huge thread on MacRumors on how to get this done.
But even if I use just 60 Hz, it still makes no difference.

That's interesting, thanks for sharing. Not applicable to my hardware but good stuff.

I'm afraid upgrading to Sonoma will break something on my hackintosh.

There is every chance that things will break somewhere, which is why my main hack with a BCM94360NG card is still on Ventura and is unlikely to be upgraded. Another build is on Sonoma as a testing ground, and things are not stable a few months in.

I also tested Windows and Linux with this USB-C output, and the picture quality is crisp and smooth on both. Only on macOS do things get a little blurry.

I'm pretty sure this is simply the consequence of Apple abandoning those drivers for a few years and only fixing things in Sonoma.
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
Yes, I believe the AMD drivers are not optimized like the M2 GPU drivers. This is evident from the fact that the 6800 XT on Windows gives a better, crisper image.

I guess the 6000 series will be the last generation to be supported, and even that support is poor.

I wish Apple would support both Nvidia and AMD GPUs, but that's not likely to happen.

My TV is the Sony A95L QD-OLED.
Got the new widescreen OLED monitor and have all three systems running at 240 Hz; there are no visible differences between the Mac Mini and the PCs running Sonoma.
 

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
Got the new widescreen OLED monitor and have all three systems running at 240 Hz; there are no visible differences between the Mac Mini and the PCs running Sonoma.
Can you give more details, like how you are connecting to the monitor (HDMI/USB-C)?

What are the resolution, bit depth, and RGB/YUV settings?
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
Can you give more details, like how you are connecting to the monitor (HDMI/USB-C)?

What are the resolution, bit depth, and RGB/YUV settings?
I use this cable to connect the Thunderbolt port to the DisplayPort connector on the monitor: UGREEN USB-C to DisplayPort 1.4 Cable.
If I knew more about the resolution, bit depth, and/or RGB/YUV information I could answer your question; it's not exactly an area of concern for me. How can I provide you with that information?
 
Last edited:

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
I use this cable to connect the Thunderbolt port to the DisplayPort connector on the monitor: UGREEN USB-C to DisplayPort 1.4 Cable.
If I knew more about the resolution, bit depth, and/or RGB/YUV information I could answer your question; it's not exactly an area of concern for me. How can I provide you with that information?
The monitor will show the resolution, bit depth, etc.

The monitor will show all the details of the input signal it's receiving. There has to be an info button somewhere in its settings.

This is how you confirm you are getting the right output.
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
The monitor will show the resolution, bit depth, etc.

The monitor will show all the details of the input signal it's receiving. There has to be an info button somewhere in its settings.

This is how you confirm you are getting the right output.
I went through all the settings on the monitor, and what you are asking for is not available. It will show 5120 x 1440 @ 240 Hz HDR, but that is it, besides other unrelated information about the model number and a calibration report. I can also see the refresh rate in macOS under Display settings, but that's it.
 

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
I went through all the settings on the monitor, and what you are asking for is not available. It will show 5120 x 1440 @ 240 Hz HDR, but that is it, besides other unrelated information about the model number and a calibration report. I can also see the refresh rate in macOS under Display settings, but that's it.
That is strange. Any recent monitor will have an OSD (on-screen display) feature that shows the details of the signal it's receiving.

What is the monitor's model name and number?
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
That is strange. Any recent monitor will have an OSD (on-screen display) feature that shows the details of the signal it's receiving.

What is the monitor's model name and number?
Somehow I knew you were going to ask that. It does have an OSD, and it appears every time a display signal is detected, but the details you ask for are not available within the monitor's menu.

49" Odyssey OLED G9 (G95SC) DQHD 240Hz 0.03ms G-Sync Compatible Curved Smart Gaming Monitor

LS49CG954SNXZA
 
Last edited:

zamfir1235

New member
AMD OS X Member
Joined
May 9, 2023
Messages
15
Somehow I knew you were going to ask that. It does have an OSD, and it appears every time a display signal is detected, but the details you ask for are not available within the monitor's menu.

49" Odyssey OLED G9 (G95SC) DQHD 240Hz 0.03ms G-Sync Compatible Curved Smart Gaming Monitor

LS49CG954SNXZA
That is typical of Samsung; not surprised at all.

Even TVs these days display all the details of the HDMI signal.

In any case, you got lucky with your display.

For me, on my TV, the difference is very clear. I guess I will have to stick with my Mini. Plus, my hackintosh cannot run apps like Gigapixel without stuttering; I cannot even install the latest version of Gigapixel.

Also, there will be no more driver updates going forward, and even if Apple does release one, it will not be as good as the optimised M-series GPU drivers.

My hackintosh is just a storage server now.

The time is coming to move on to real Macs. Sad, but such is reality. It was a good run; we all knew the day was coming.

Enjoy your hackintoshes for whatever time is left.
 

leesurone

Donator
AMD OS X Member
Joined
May 6, 2020
Messages
313
That is typical of Samsung; not surprised at all.

Even TVs these days display all the details of the HDMI signal.

In any case, you got lucky with your display.

For me, on my TV, the difference is very clear. I guess I will have to stick with my Mini. Plus, my hackintosh cannot run apps like Gigapixel without stuttering; I cannot even install the latest version of Gigapixel.

Also, there will be no more driver updates going forward, and even if Apple does release one, it will not be as good as the optimised M-series GPU drivers.

My hackintosh is just a storage server now.

The time is coming to move on to real Macs. Sad, but such is reality. It was a good run; we all knew the day was coming.

Enjoy your hackintoshes for whatever time is left.
I've had a little luck, sure, though I never doubted this new monitor would work. I had a very similar monitor prior to this, now listed on eBay, that also worked as expected with the same setup.
 