XRGB8888 is the default format that Xorg will want to use. Add support
for it to the ZynqMP DisplayPort driver so that applications can use
32-bit framebuffers. This fixes the X server failing to start unless an
xorg.conf that sets DefaultDepth to 16 is provided.
Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl>
---
drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++
1 file changed, 5 insertions(+)
diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c
index 80d1e499a18d..501428437000 100644
--- a/drivers/gpu/drm/xlnx/zynqmp_disp.c
+++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c
@@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = {
.buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
.swap = true,
.sf = scaling_factors_888,
+ }, {
+ .drm_fmt = DRM_FORMAT_XRGB8888,
+ .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
+ .swap = true,
+ .sf = scaling_factors_888,
}, {
.drm_fmt = DRM_FORMAT_RGBA8888,
.buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888,
--
2.43.0
base-commit: 67a993863163cb88b1b68974c31b0d84ece4293e
branch: linux-master-zynqmpdp-32bit
Met vriendelijke groet / kind regards,
Mike Looijmans
System Expert
TOPIC Embedded Products B.V.
Materiaalweg 4, 5681 RJ Best
The Netherlands
T: +31 (0) 499 33 69 69
E: mike.looijmans@topic.nl
W: www.topic.nl
Please consider the environment before printing this e-mail
Hi Mike, Thank you for the patch. On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > XRGB8888 is the default mode that Xorg will want to use. Add support > for this to the Zynqmp DisplayPort driver, so that applications can use > 32-bit framebuffers. This solves that the X server would fail to start > unless one provided an xorg.conf that sets DefaultDepth to 16. > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > --- > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > 1 file changed, 5 insertions(+) > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > index 80d1e499a18d..501428437000 100644 > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > .swap = true, > .sf = scaling_factors_888, > + }, { > + .drm_fmt = DRM_FORMAT_XRGB8888, > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > + .swap = true, > + .sf = scaling_factors_888, I'm afraid that's not enough. There's a crucial difference between DRM_FORMAT_ARGB8888 (already supported by this driver) and DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. The graphics layer is blended on top of the video layer, and the blender uses both a global alpha parameter and the alpha channel of the graphics layer for 32-bit RGB formats. This will lead to incorrect operation when the 'X' component is not set to full opacity. > }, { > .drm_fmt = DRM_FORMAT_RGBA8888, > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888, -- Regards, Laurent Pinchart
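A rough way to see the problem: if the blender honours both the global alpha and the per-pixel alpha of a 32-bit graphics pixel, an 'X' byte that userspace leaves at zero makes the graphics layer fully transparent over the video layer. The toy model below only illustrates that reading of the description above; how the hardware actually combines the two alpha sources is an assumption, and this is not driver code.

#include <linux/types.h>

/*
 * Toy model of graphics-over-video blending as described above. Combining
 * the two alpha sources by multiplication is an assumption; the point is
 * only that the per-pixel alpha of a 32-bit graphics pixel is not ignored,
 * so an 'X' byte left at 0 would hide the graphics layer completely.
 */
static inline u8 blend_component(u8 gfx, u8 video, u8 pixel_alpha,
                                 u8 global_alpha)
{
        unsigned int alpha = pixel_alpha * global_alpha / 255;

        return (gfx * alpha + video * (255 - alpha)) / 255;
}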
On 27-06-2025 20:19, Laurent Pinchart wrote: > Hi Mike, > > Thank you for the patch. > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: >> XRGB8888 is the default mode that Xorg will want to use. Add support >> for this to the Zynqmp DisplayPort driver, so that applications can use >> 32-bit framebuffers. This solves that the X server would fail to start >> unless one provided an xorg.conf that sets DefaultDepth to 16. >> >> Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> >> --- >> >> drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ >> 1 file changed, 5 insertions(+) >> >> diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c >> index 80d1e499a18d..501428437000 100644 >> --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c >> +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c >> @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { >> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >> .swap = true, >> .sf = scaling_factors_888, >> + }, { >> + .drm_fmt = DRM_FORMAT_XRGB8888, >> + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >> + .swap = true, >> + .sf = scaling_factors_888, > I'm afraid that's not enough. There's a crucial difference between > DRM_FORMAT_ARGB8888 (already supported by this driver) and > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > The graphics layer is blended on top of the video layer, and the blender > uses both a global alpha parameter and the alpha channel of the graphics > layer for 32-bit RGB formats. This will lead to incorrect operation when > the 'X' component is not set to full opacity. I spent a few hours digging in the source code and what I could find in the TRM and register maps, but there's not enough information in there to explain how the blender works. The obvious "XRGB" implementation would be to just disable the blender. What I got from experimenting so far is that the alpha component is ignored anyway while the video path isn't active. So as long as one isn't using the video blending path, the ARGB and XRGB modes are identical. Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB modes. (For our application, this patch is sufficient as it solves the issues like X11 not starting up, OpenGL not working and horrendously slow scaling performance) > >> }, { >> .drm_fmt = DRM_FORMAT_RGBA8888, >> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888, -- Mike Looijmans System Expert TOPIC Embedded Products B.V. Materiaalweg 4, 5681 RJ Best The Netherlands T: +31 (0) 499 33 69 69 E: mike.looijmans@topic.nl W: www.topic.nl
On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > On 27-06-2025 20:19, Laurent Pinchart wrote: > > Hi Mike, > > > > Thank you for the patch. > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > 32-bit framebuffers. This solves that the X server would fail to start > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > --- > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > 1 file changed, 5 insertions(+) > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > index 80d1e499a18d..501428437000 100644 > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > .swap = true, > > > .sf = scaling_factors_888, > > > + }, { > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > + .swap = true, > > > + .sf = scaling_factors_888, > > I'm afraid that's not enough. There's a crucial difference between > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > The graphics layer is blended on top of the video layer, and the blender > > uses both a global alpha parameter and the alpha channel of the graphics > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > the 'X' component is not set to full opacity. > > I spent a few hours digging in the source code and what I could find in the > TRM and register maps, but there's not enough information in there to > explain how the blender works. The obvious "XRGB" implementation would be to > just disable the blender. > > What I got from experimenting so far is that the alpha component is ignored > anyway while the video path isn't active. So as long as one isn't using the > video blending path, the ARGB and XRGB modes are identical. > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > modes. > > (For our application, this patch is sufficient as it solves the issues like > X11 not starting up, OpenGL not working and horrendously slow scaling > performance) Given that we consider XRGB8888 mandatory, this patch is a good thing to have anyway, even if suboptimal, or broken in some scenario we can always fix later. Maxime
On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > --- > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > 1 file changed, 5 insertions(+) > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > index 80d1e499a18d..501428437000 100644 > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > .swap = true, > > > > .sf = scaling_factors_888, > > > > + }, { > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > + .swap = true, > > > > + .sf = scaling_factors_888, > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > The graphics layer is blended on top of the video layer, and the blender > > > uses both a global alpha parameter and the alpha channel of the graphics > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > the 'X' component is not set to full opacity. > > > > I spent a few hours digging in the source code and what I could find in the > > TRM and register maps, but there's not enough information in there to > > explain how the blender works. The obvious "XRGB" implementation would be to > > just disable the blender. > > > > What I got from experimenting so far is that the alpha component is ignored > > anyway while the video path isn't active. So as long as one isn't using the > > video blending path, the ARGB and XRGB modes are identical. > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > modes. > > > > (For our application, this patch is sufficient as it solves the issues like > > X11 not starting up, OpenGL not working and horrendously slow scaling > > performance) > > Given that we consider XRGB8888 mandatory, How about platforms that can't support it at all ? > this patch is a good thing to > have anyway, even if suboptimal, or broken in some scenario we can > always fix later. It needs to at least be updated to disallow XRGB8888 usage when the video plan is enabled, or when global alpha is set to a non-opaque value. -- Regards, Laurent Pinchart
On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > > --- > > > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > > 1 file changed, 5 insertions(+) > > > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > index 80d1e499a18d..501428437000 100644 > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > .swap = true, > > > > > .sf = scaling_factors_888, > > > > > + }, { > > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > + .swap = true, > > > > > + .sf = scaling_factors_888, > > > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > > The graphics layer is blended on top of the video layer, and the blender > > > > uses both a global alpha parameter and the alpha channel of the graphics > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > > the 'X' component is not set to full opacity. > > > > > > I spent a few hours digging in the source code and what I could find in the > > > TRM and register maps, but there's not enough information in there to > > > explain how the blender works. The obvious "XRGB" implementation would be to > > > just disable the blender. > > > > > > What I got from experimenting so far is that the alpha component is ignored > > > anyway while the video path isn't active. So as long as one isn't using the > > > video blending path, the ARGB and XRGB modes are identical. > > > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > > modes. > > > > > > (For our application, this patch is sufficient as it solves the issues like > > > X11 not starting up, OpenGL not working and horrendously slow scaling > > > performance) > > > > Given that we consider XRGB8888 mandatory, > > How about platforms that can't support it at all ? We emulate it. > > this patch is a good thing to > > have anyway, even if suboptimal, or broken in some scenario we can > > always fix later. > > It needs to at least be updated to disallow XRGB8888 usage when the > video plan is enabled, or when global alpha is set to a non-opaque > value. Yeah, that's reasonable Maxime
On 30-06-2025 11:29, Maxime Ripard wrote: > On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: >> On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: >>> On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: >>>> On 27-06-2025 20:19, Laurent Pinchart wrote: >>>>> On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: >>>>>> XRGB8888 is the default mode that Xorg will want to use. Add support >>>>>> for this to the Zynqmp DisplayPort driver, so that applications can use >>>>>> 32-bit framebuffers. This solves that the X server would fail to start >>>>>> unless one provided an xorg.conf that sets DefaultDepth to 16. >>>>>> >>>>>> Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> >>>>>> --- >>>>>> >>>>>> drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ >>>>>> 1 file changed, 5 insertions(+) >>>>>> >>>>>> diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>>>> index 80d1e499a18d..501428437000 100644 >>>>>> --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>>>> +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>>>> @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { >>>>>> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >>>>>> .swap = true, >>>>>> .sf = scaling_factors_888, >>>>>> + }, { >>>>>> + .drm_fmt = DRM_FORMAT_XRGB8888, >>>>>> + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >>>>>> + .swap = true, >>>>>> + .sf = scaling_factors_888, >>>>> I'm afraid that's not enough. There's a crucial difference between >>>>> DRM_FORMAT_ARGB8888 (already supported by this driver) and >>>>> DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. >>>>> The graphics layer is blended on top of the video layer, and the blender >>>>> uses both a global alpha parameter and the alpha channel of the graphics >>>>> layer for 32-bit RGB formats. This will lead to incorrect operation when >>>>> the 'X' component is not set to full opacity. >>>> I spent a few hours digging in the source code and what I could find in the >>>> TRM and register maps, but there's not enough information in there to >>>> explain how the blender works. The obvious "XRGB" implementation would be to >>>> just disable the blender. >>>> >>>> What I got from experimenting so far is that the alpha component is ignored >>>> anyway while the video path isn't active. So as long as one isn't using the >>>> video blending path, the ARGB and XRGB modes are identical. >>>> >>>> Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB >>>> modes. >>>> >>>> (For our application, this patch is sufficient as it solves the issues like >>>> X11 not starting up, OpenGL not working and horrendously slow scaling >>>> performance) >>> Given that we consider XRGB8888 mandatory, >> How about platforms that can't support it at all ? > We emulate it. > >>> this patch is a good thing to >>> have anyway, even if suboptimal, or broken in some scenario we can >>> always fix later. >> It needs to at least be updated to disallow XRGB8888 usage when the >> video plan is enabled, or when global alpha is set to a non-opaque >> value. > Yeah, that's reasonable And feasible too I think. Basically only allow XRGB8888 when things are either totally transparent or totally opaque. I'm only concerned it might to lead to strange behavior, depending on which layer you enable first. > > Maxime -- Mike Looijmans
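A minimal sketch of the check being discussed, assuming the global alpha is driven by the graphics plane's standard "alpha" property and that the driver has some way to reach the video plane (passed in here as a parameter); this is not the existing zynqmp_disp code:

#include <linux/err.h>
#include <drm/drm_atomic.h>
#include <drm/drm_blend.h>
#include <drm/drm_fourcc.h>
#include <drm/drm_framebuffer.h>
#include <drm/drm_plane.h>

/*
 * Reject XRGB8888 on the graphics plane while the video plane is enabled
 * or while global alpha is not fully opaque. How @video_plane is obtained,
 * and whether global alpha really maps to the graphics plane's "alpha"
 * property, are assumptions for this sketch.
 */
static int gfx_plane_check_xrgb8888(struct drm_atomic_state *state,
                                    struct drm_plane_state *gfx_state,
                                    struct drm_plane *video_plane)
{
        struct drm_plane_state *video_state;

        if (!gfx_state->fb ||
            gfx_state->fb->format->format != DRM_FORMAT_XRGB8888)
                return 0;

        if (gfx_state->alpha != DRM_BLEND_ALPHA_OPAQUE)
                return -EINVAL;

        video_state = drm_atomic_get_plane_state(state, video_plane);
        if (IS_ERR(video_state))
                return PTR_ERR(video_state);

        /* XRGB8888 is only safe while nothing is blended underneath. */
        return video_state->crtc ? -EINVAL : 0;
}

The ordering concern Mike raises would still apply: a commit that enables the video plane after the graphics plane is already scanning out XRGB8888 would be the one to fail.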
On Mon, Jun 30, 2025 at 11:29:08AM +0200, Maxime Ripard wrote: > On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: > > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > > > --- > > > > > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > > > 1 file changed, 5 insertions(+) > > > > > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > index 80d1e499a18d..501428437000 100644 > > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > .swap = true, > > > > > > .sf = scaling_factors_888, > > > > > > + }, { > > > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > + .swap = true, > > > > > > + .sf = scaling_factors_888, > > > > > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > > > The graphics layer is blended on top of the video layer, and the blender > > > > > uses both a global alpha parameter and the alpha channel of the graphics > > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > > > the 'X' component is not set to full opacity. > > > > > > > > I spent a few hours digging in the source code and what I could find in the > > > > TRM and register maps, but there's not enough information in there to > > > > explain how the blender works. The obvious "XRGB" implementation would be to > > > > just disable the blender. > > > > > > > > What I got from experimenting so far is that the alpha component is ignored > > > > anyway while the video path isn't active. So as long as one isn't using the > > > > video blending path, the ARGB and XRGB modes are identical. > > > > > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > > > modes. > > > > > > > > (For our application, this patch is sufficient as it solves the issues like > > > > X11 not starting up, OpenGL not working and horrendously slow scaling > > > > performance) > > > > > > Given that we consider XRGB8888 mandatory, > > > > How about platforms that can't support it at all ? > > We emulate it. Does that imply a full memcpy of the frame buffer in the kernel driver, or is it emulated in userspace ? > > > this patch is a good thing to > > > have anyway, even if suboptimal, or broken in some scenario we can > > > always fix later. > > > > It needs to at least be updated to disallow XRGB8888 usage when the > > video plan is enabled, or when global alpha is set to a non-opaque > > value. 
>
> Yeah, that's reasonable

--
Regards,

Laurent Pinchart
On Mon, Jun 30, 2025 at 12:33:35PM +0300, Laurent Pinchart wrote: > On Mon, Jun 30, 2025 at 11:29:08AM +0200, Maxime Ripard wrote: > > On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: > > > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > > > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > > > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > > > > --- > > > > > > > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > > > > 1 file changed, 5 insertions(+) > > > > > > > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > index 80d1e499a18d..501428437000 100644 > > > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > .swap = true, > > > > > > > .sf = scaling_factors_888, > > > > > > > + }, { > > > > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > + .swap = true, > > > > > > > + .sf = scaling_factors_888, > > > > > > > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > > > > The graphics layer is blended on top of the video layer, and the blender > > > > > > uses both a global alpha parameter and the alpha channel of the graphics > > > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > > > > the 'X' component is not set to full opacity. > > > > > > > > > > I spent a few hours digging in the source code and what I could find in the > > > > > TRM and register maps, but there's not enough information in there to > > > > > explain how the blender works. The obvious "XRGB" implementation would be to > > > > > just disable the blender. > > > > > > > > > > What I got from experimenting so far is that the alpha component is ignored > > > > > anyway while the video path isn't active. So as long as one isn't using the > > > > > video blending path, the ARGB and XRGB modes are identical. > > > > > > > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > > > > modes. > > > > > > > > > > (For our application, this patch is sufficient as it solves the issues like > > > > > X11 not starting up, OpenGL not working and horrendously slow scaling > > > > > performance) > > > > > > > > Given that we consider XRGB8888 mandatory, > > > > > > How about platforms that can't support it at all ? > > > > We emulate it. > > Does that imply a full memcpy of the frame buffer in the kernel driver, > or is it emulated in userspace ? 
Neither :)

The kernel deals with it through drm_fb_xrgb8888_to_* helpers, but only
on the parts of the framebuffer that were modified through the damage
API.

Maxime
On Mon, Jun 30, 2025 at 12:52:48PM +0200, Maxime Ripard wrote: > On Mon, Jun 30, 2025 at 12:33:35PM +0300, Laurent Pinchart wrote: > > On Mon, Jun 30, 2025 at 11:29:08AM +0200, Maxime Ripard wrote: > > > On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: > > > > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > > > > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > > > > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > > > > > --- > > > > > > > > > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > > > > > 1 file changed, 5 insertions(+) > > > > > > > > > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > index 80d1e499a18d..501428437000 100644 > > > > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > > .swap = true, > > > > > > > > .sf = scaling_factors_888, > > > > > > > > + }, { > > > > > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > > + .swap = true, > > > > > > > > + .sf = scaling_factors_888, > > > > > > > > > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > > > > > The graphics layer is blended on top of the video layer, and the blender > > > > > > > uses both a global alpha parameter and the alpha channel of the graphics > > > > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > > > > > the 'X' component is not set to full opacity. > > > > > > > > > > > > I spent a few hours digging in the source code and what I could find in the > > > > > > TRM and register maps, but there's not enough information in there to > > > > > > explain how the blender works. The obvious "XRGB" implementation would be to > > > > > > just disable the blender. > > > > > > > > > > > > What I got from experimenting so far is that the alpha component is ignored > > > > > > anyway while the video path isn't active. So as long as one isn't using the > > > > > > video blending path, the ARGB and XRGB modes are identical. > > > > > > > > > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > > > > > modes. > > > > > > > > > > > > (For our application, this patch is sufficient as it solves the issues like > > > > > > X11 not starting up, OpenGL not working and horrendously slow scaling > > > > > > performance) > > > > > > > > > > Given that we consider XRGB8888 mandatory, > > > > > > > > How about platforms that can't support it at all ? > > > > > > We emulate it. 
> >
> > Does that imply a full memcpy of the frame buffer in the kernel driver,
> > or is it emulated in userspace ?
>
> Neither :)
>
> The kernel deals with it through drm_fb_xrgb8888_to_* helpers, but only
> on the parts of the framebuffer that were modified through the damage
> API.

Aahhh OK, it's for the fbdev emulation. So that means that drivers are
not required to support XRGB8888 ?

--
Regards,

Laurent Pinchart
On Mon, Jun 30, 2025 at 02:30:08PM +0300, Laurent Pinchart wrote: > On Mon, Jun 30, 2025 at 12:52:48PM +0200, Maxime Ripard wrote: > > On Mon, Jun 30, 2025 at 12:33:35PM +0300, Laurent Pinchart wrote: > > > On Mon, Jun 30, 2025 at 11:29:08AM +0200, Maxime Ripard wrote: > > > > On Mon, Jun 30, 2025 at 12:11:56PM +0300, Laurent Pinchart wrote: > > > > > On Mon, Jun 30, 2025 at 10:27:55AM +0200, Maxime Ripard wrote: > > > > > > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > > > > > > > On 27-06-2025 20:19, Laurent Pinchart wrote: > > > > > > > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > > > > > > > > > XRGB8888 is the default mode that Xorg will want to use. Add support > > > > > > > > > for this to the Zynqmp DisplayPort driver, so that applications can use > > > > > > > > > 32-bit framebuffers. This solves that the X server would fail to start > > > > > > > > > unless one provided an xorg.conf that sets DefaultDepth to 16. > > > > > > > > > > > > > > > > > > Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > > > > > > > > > --- > > > > > > > > > > > > > > > > > > drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > > > > > > > > > 1 file changed, 5 insertions(+) > > > > > > > > > > > > > > > > > > diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > > index 80d1e499a18d..501428437000 100644 > > > > > > > > > --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > > +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > > > > > > > > > @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > > > > > > > > > .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > > > .swap = true, > > > > > > > > > .sf = scaling_factors_888, > > > > > > > > > + }, { > > > > > > > > > + .drm_fmt = DRM_FORMAT_XRGB8888, > > > > > > > > > + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > > > > > > > > > + .swap = true, > > > > > > > > > + .sf = scaling_factors_888, > > > > > > > > > > > > > > > > I'm afraid that's not enough. There's a crucial difference between > > > > > > > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > > > > > > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > > > > > > > The graphics layer is blended on top of the video layer, and the blender > > > > > > > > uses both a global alpha parameter and the alpha channel of the graphics > > > > > > > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > > > > > > > the 'X' component is not set to full opacity. > > > > > > > > > > > > > > I spent a few hours digging in the source code and what I could find in the > > > > > > > TRM and register maps, but there's not enough information in there to > > > > > > > explain how the blender works. The obvious "XRGB" implementation would be to > > > > > > > just disable the blender. > > > > > > > > > > > > > > What I got from experimenting so far is that the alpha component is ignored > > > > > > > anyway while the video path isn't active. So as long as one isn't using the > > > > > > > video blending path, the ARGB and XRGB modes are identical. > > > > > > > > > > > > > > Guess I'll need assistance from AMD/Xilinx to completely implement the XRGB > > > > > > > modes. 
> > > > > > >
> > > > > > > (For our application, this patch is sufficient as it solves the issues like
> > > > > > > X11 not starting up, OpenGL not working and horrendously slow scaling
> > > > > > > performance)
> > > > > > Given that we consider XRGB8888 mandatory,
> > > > > How about platforms that can't support it at all ?
> > > > We emulate it.
> > > Does that imply a full memcpy of the frame buffer in the kernel driver,
> > > or is it emulated in userspace ?
> > Neither :)
> > The kernel deals with it through drm_fb_xrgb8888_to_* helpers, but only
> > on the parts of the framebuffer that were modified through the damage
> > API.
> Aahhh OK, it's for the fbdev emulation. So that means that drivers are
> not required to support XRGB8888 ?

No, it's for KMS.

Maxime
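For reference, a rough sketch of how that emulation is typically wired into a plane update: iterate the damage clips and convert only those rectangles into a format the hardware does support, usually with one of the drm_fb_xrgb8888_to_*() helpers from drm_format_helper.c. The convert_rect() callback below stands in for that device-specific conversion and is not a real kernel interface.

#include <drm/drm_damage_helper.h>
#include <drm/drm_plane.h>
#include <drm/drm_rect.h>

/*
 * Damage-based XRGB8888 emulation, as described above: only the regions
 * reported through the damage API are converted on each update. The
 * convert_rect() callback is a placeholder for the device-specific copy
 * (e.g. one of the drm_fb_xrgb8888_to_*() helpers writing to the scanout
 * buffer).
 */
static void emulate_xrgb8888_update(struct drm_plane_state *old_state,
                                    struct drm_plane_state *new_state,
                                    void (*convert_rect)(struct drm_plane_state *state,
                                                         const struct drm_rect *clip))
{
        struct drm_atomic_helper_damage_iter iter;
        struct drm_rect clip;

        drm_atomic_helper_damage_iter_init(&iter, old_state, new_state);
        while (drm_atomic_helper_damage_iter_next(&iter, &clip))
                convert_rect(new_state, &clip);
}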
On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: > On 27-06-2025 20:19, Laurent Pinchart wrote: > > On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: > >> XRGB8888 is the default mode that Xorg will want to use. Add support > >> for this to the Zynqmp DisplayPort driver, so that applications can use > >> 32-bit framebuffers. This solves that the X server would fail to start > >> unless one provided an xorg.conf that sets DefaultDepth to 16. > >> > >> Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> > >> --- > >> > >> drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ > >> 1 file changed, 5 insertions(+) > >> > >> diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c > >> index 80d1e499a18d..501428437000 100644 > >> --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c > >> +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c > >> @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { > >> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > >> .swap = true, > >> .sf = scaling_factors_888, > >> + }, { > >> + .drm_fmt = DRM_FORMAT_XRGB8888, > >> + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, > >> + .swap = true, > >> + .sf = scaling_factors_888, > > > > I'm afraid that's not enough. There's a crucial difference between > > DRM_FORMAT_ARGB8888 (already supported by this driver) and > > DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. > > The graphics layer is blended on top of the video layer, and the blender > > uses both a global alpha parameter and the alpha channel of the graphics > > layer for 32-bit RGB formats. This will lead to incorrect operation when > > the 'X' component is not set to full opacity. > > I spent a few hours digging in the source code and what I could find in > the TRM and register maps, but there's not enough information in there > to explain how the blender works. The obvious "XRGB" implementation > would be to just disable the blender. That won't work when using global alpha unfortunately :-( > What I got from experimenting so far is that the alpha component is > ignored anyway while the video path isn't active. So as long as one > isn't using the video blending path, the ARGB and XRGB modes are identical. Correct, *if* global alpha is set to full opaque, then you can disable the blender. That could confuse userspace though, enabling the graphics plane with XRGB would work, and then enabling the video plane with global alpha would fail. > Guess I'll need assistance from AMD/Xilinx to completely implement the > XRGB modes. The blender can ignore the alpha channel of the graphics plane for formats that have no alpha channel. It would be nice if there was a bit to force that behaviour for 32-bit RGB too, but I couldn't find any :-( It's worth asking though. > (For our application, this patch is sufficient as it solves the issues > like X11 not starting up, OpenGL not working and horrendously slow > scaling performance) > > >> }, { > >> .drm_fmt = DRM_FORMAT_RGBA8888, > >> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888, -- Regards, Laurent Pinchart
On 30-06-2025 10:21, Laurent Pinchart wrote: > On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote: >> On 27-06-2025 20:19, Laurent Pinchart wrote: >>> On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote: >>>> XRGB8888 is the default mode that Xorg will want to use. Add support >>>> for this to the Zynqmp DisplayPort driver, so that applications can use >>>> 32-bit framebuffers. This solves that the X server would fail to start >>>> unless one provided an xorg.conf that sets DefaultDepth to 16. >>>> >>>> Signed-off-by: Mike Looijmans <mike.looijmans@topic.nl> >>>> --- >>>> >>>> drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++ >>>> 1 file changed, 5 insertions(+) >>>> >>>> diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>> index 80d1e499a18d..501428437000 100644 >>>> --- a/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>> +++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c >>>> @@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = { >>>> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >>>> .swap = true, >>>> .sf = scaling_factors_888, >>>> + }, { >>>> + .drm_fmt = DRM_FORMAT_XRGB8888, >>>> + .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888, >>>> + .swap = true, >>>> + .sf = scaling_factors_888, >>> I'm afraid that's not enough. There's a crucial difference between >>> DRM_FORMAT_ARGB8888 (already supported by this driver) and >>> DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored. >>> The graphics layer is blended on top of the video layer, and the blender >>> uses both a global alpha parameter and the alpha channel of the graphics >>> layer for 32-bit RGB formats. This will lead to incorrect operation when >>> the 'X' component is not set to full opacity. >> I spent a few hours digging in the source code and what I could find in >> the TRM and register maps, but there's not enough information in there >> to explain how the blender works. The obvious "XRGB" implementation >> would be to just disable the blender. > That won't work when using global alpha unfortunately :-( > >> What I got from experimenting so far is that the alpha component is >> ignored anyway while the video path isn't active. So as long as one >> isn't using the video blending path, the ARGB and XRGB modes are identical. > Correct, *if* global alpha is set to full opaque, then you can disable > the blender. That could confuse userspace though, enabling the graphics > plane with XRGB would work, and then enabling the video plane with > global alpha would fail. > >> Guess I'll need assistance from AMD/Xilinx to completely implement the >> XRGB modes. > The blender can ignore the alpha channel of the graphics plane for > formats that have no alpha channel. It would be nice if there was a bit > to force that behaviour for 32-bit RGB too, but I couldn't find any :-( > It's worth asking though. Yes, my problem exactly. > >> (For our application, this patch is sufficient as it solves the issues >> like X11 not starting up, OpenGL not working and horrendously slow >> scaling performance) >> >>>> }, { >>>> .drm_fmt = DRM_FORMAT_RGBA8888, >>>> .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888, -- Mike Looijmans System Expert TOPIC Embedded Products B.V. Materiaalweg 4, 5681 RJ Best The Netherlands T: +31 (0) 499 33 69 69 E: mike.looijmans@topic.nl W: www.topic.nl