[PATCH] drm/ssd130x: Change pixel format used to compute the buffer size
Posted by Javier Martinez Canillas 2 years, 7 months ago
The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
to calculate the size of the buffer allocated to store the native pixels.

But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
color-indexed frame buffer formats, while the ssd130x controllers support
neither different single-channel colors nor a Color Lookup Table (CLUT).

Both formats use eight pixels per byte, so in practice there is no
functional change in this patch. Still, the correct pixel format should
be used.
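
As a rough illustration of that arithmetic (plain C, independent of the
driver; the 128x64 geometry is just an example), the pitch and total
buffer size for a 1 bit-per-pixel format come out the same whether the
fourcc is C1 or R1:

#include <stdio.h>

/* Bytes per scanline for a format that packs eight pixels per byte. */
static unsigned int pitch_1bpp(unsigned int width)
{
        return (width + 7) / 8; /* round up to whole bytes */
}

int main(void)
{
        unsigned int width = 128, height = 64;  /* example panel geometry */
        unsigned int pitch = pitch_1bpp(width);

        /* 16 bytes per line and 1024 bytes in total for 128x64 */
        printf("pitch = %u bytes, buffer = %u bytes\n", pitch, pitch * height);
        return 0;
}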

Suggested-by: Geert Uytterhoeven <geert@linux-m68k.org>
Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>
---

 drivers/gpu/drm/solomon/ssd130x.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/drivers/gpu/drm/solomon/ssd130x.c b/drivers/gpu/drm/solomon/ssd130x.c
index b3dc1ca9dc10..afb08a8aa9fc 100644
--- a/drivers/gpu/drm/solomon/ssd130x.c
+++ b/drivers/gpu/drm/solomon/ssd130x.c
@@ -153,7 +153,7 @@ static int ssd130x_buf_alloc(struct ssd130x_device *ssd130x)
 	const struct drm_format_info *fi;
 	unsigned int pitch;
 
-	fi = drm_format_info(DRM_FORMAT_C1);
+	fi = drm_format_info(DRM_FORMAT_R1);
 	if (!fi)
 		return -EINVAL;
 
-- 
2.41.0
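
For context, a minimal sketch of how the DRM format-info helpers can be
combined to size such a buffer (kernel-style C; the function name and the
kcalloc()-based allocation are illustrative, not a verbatim copy of
ssd130x_buf_alloc()):

#include <drm/drm_fourcc.h>
#include <linux/slab.h>

/* Illustrative helper: size a native 1 bpp buffer from the R1 format info. */
static void *alloc_r1_buffer(unsigned int width, unsigned int height)
{
        const struct drm_format_info *fi = drm_format_info(DRM_FORMAT_R1);
        unsigned int pitch;

        if (!fi)
                return NULL;

        /* R1 packs eight pixels per byte: pitch == DIV_ROUND_UP(width, 8) */
        pitch = drm_format_info_min_pitch(fi, 0, width);

        return kcalloc(pitch, height, GFP_KERNEL);
}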
Re: [PATCH] drm/ssd130x: Change pixel format used to compute the buffer size
Posted by Geert Uytterhoeven 2 years, 7 months ago
On Thu, Jul 13, 2023 at 10:59 AM Javier Martinez Canillas
<javierm@redhat.com> wrote:
> The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
> ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
> to calculate the size of the buffer allocated to store the native pixels.
>
> But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
> color-indexed frame buffer formats, while the ssd130x controllers support
> neither different single-channel colors nor a Color Lookup Table (CLUT).
>
> Both formats use eight pixels per byte, so in practice there is no
> functional change in this patch. Still, the correct pixel format should
> be used.
>
> Suggested-by: Geert Uytterhoeven <geert@linux-m68k.org>
> Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>

Reviewed-by: Geert Uytterhoeven <geert@linux-m68k.org>

Gr{oetje,eeting}s,

                        Geert

-- 
Geert Uytterhoeven -- There's lots of Linux beyond ia32 -- geert@linux-m68k.org

In personal conversations with technical people, I call myself a hacker. But
when I'm talking to journalists I just say "programmer" or something like that.
                                -- Linus Torvalds
Re: [PATCH] drm/ssd130x: Change pixel format used to compute the buffer size
Posted by Thomas Zimmermann 2 years, 7 months ago

On 13.07.23 10:58, Javier Martinez Canillas wrote:
> The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
> ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
> to calculate the size of the buffer allocated to store the native pixels.
> 
> But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
> color-indexed frame buffer formats, while the ssd130x controllers support
> neither different single-channel colors nor a Color Lookup Table (CLUT).

Makes sense to me.

Reviewed-by: Thomas Zimmermann <tzimmermann@suse.de>

> 
> Both formats use eight pixels per byte, so in practice there is no
> functional change in this patch. Still, the correct pixel format should
> be used.
> 
> Suggested-by: Geert Uytterhoeven <geert@linux-m68k.org>
> Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>
> ---
> 
>   drivers/gpu/drm/solomon/ssd130x.c | 2 +-
>   1 file changed, 1 insertion(+), 1 deletion(-)
> 
> diff --git a/drivers/gpu/drm/solomon/ssd130x.c b/drivers/gpu/drm/solomon/ssd130x.c
> index b3dc1ca9dc10..afb08a8aa9fc 100644
> --- a/drivers/gpu/drm/solomon/ssd130x.c
> +++ b/drivers/gpu/drm/solomon/ssd130x.c
> @@ -153,7 +153,7 @@ static int ssd130x_buf_alloc(struct ssd130x_device *ssd130x)
>   	const struct drm_format_info *fi;
>   	unsigned int pitch;
>   
> -	fi = drm_format_info(DRM_FORMAT_C1);
> +	fi = drm_format_info(DRM_FORMAT_R1);
>   	if (!fi)
>   		return -EINVAL;
>   

-- 
Thomas Zimmermann
Graphics Driver Developer
SUSE Software Solutions Germany GmbH
Frankenstrasse 146, 90461 Nuernberg, Germany
GF: Ivo Totev, Andrew Myers, Andrew McDonald, Boudien Moerman
HRB 36809 (AG Nuernberg)
Re: [PATCH] drm/ssd130x: Change pixel format used to compute the buffer size
Posted by Javier Martinez Canillas 2 years, 7 months ago
Thomas Zimmermann <tzimmermann@suse.de> writes:

> On 13.07.23 10:58, Javier Martinez Canillas wrote:
>> The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
>> ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
>> to calculate the size of the buffer allocated to store the native pixels.
>> 
>> But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
>> color-indexed frame buffer formats, while the ssd130x controllers support
>> neither different single-channel colors nor a Color Lookup Table (CLUT).
>
> Makes sense to me.
>
> Reviewed-by: Thomas Zimmermann <tzimmermann@suse.de>
>

Thanks Geert and Thomas for your review. I've fixed some typos I had in my
commit message and pushed this to drm-misc-next.

-- 
Best regards,

Javier Martinez Canillas
Core Platforms
Red Hat