Hello everybody,
I’m using a 24-bit single-channel LVDS display at 1920x720 resolution with a Toradex TK1.
I configured the timings and other parameters in the device tree and the display shows the desktop after startup.
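For reference, the timing node follows the generic display-timings binding and looks roughly like this (the blanking and clock values below are placeholders, not my actual ones; the real values come from the panel datasheet):

```
display-timings {
	native-mode = <&timing_1920x720>;
	timing_1920x720: timing0 {
		clock-frequency = <85000000>;	/* pixel clock in Hz */
		hactive = <1920>;
		vactive = <720>;
		hfront-porch = <64>;		/* placeholder */
		hback-porch = <64>;		/* placeholder */
		hsync-len = <32>;		/* placeholder */
		vfront-porch = <10>;		/* placeholder */
		vback-porch = <10>;		/* placeholder */
		vsync-len = <10>;		/* placeholder */
		de-active = <1>;		/* panel is driven in DE-only mode */
	};
};
```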
The resolution seems fine, but there is a strange issue with the colors. I’m not sure if it is a display timing problem or a software configuration issue.
This is what I see:
[upload|ogX36vZIZR0ufLmAcnbBZ7Q7rcs=]
The image is not crisp and the bottom bar looks smeared, as if rendered with reduced color depth. The mouse pointer is fine across the entire display, so I assume it is not a timing problem, but I’m ready to change my mind.
Another strange thing: before LXDE starts, the touch calibration screen is shown, and that one looks clear and crisp.
Here is a detail with the mouse over the menu bar:
[upload|BHXI7avZWcR/PB4x+i+lwL0xTBI=]
Hello,
thanks for your reply. One more detail that may help: if I stop the X server, fbset reports that the framebuffer is running at 32bpp, while it runs at 16bpp when the X server is active.
Here are replies to your questions:
Software version: 2.8b6
Vidargs are empty
The display is a dual-channel LVDS panel connected to the TK1 through a single-channel to dual-channel LVDS converter board. It works in DE-only mode and these are the timings:
[upload|aHo99lNnOUPAkmDGsa+idX7Qdn0=]
The converter board has a DS90CF386 LVDS deserializer and a DS90C387 LVDS serializer.
This is how the test pictures look on the screen:
[upload|KTYo8OyTNo6wU/huG5UE7COP0rQ=]
[upload|QVqdJbCLnvb+QJIJeHssdVwZLUE=]
Thank you very much,
Best regards
I also tried TEGRA_DC_LVDS_24_1; the result was a black desktop wallpaper and the green Toradex logo rendered red. So I assumed that TEGRA_DC_LVDS_24_0 is correct.
The 85 MHz clock frequency is the maximum of the LVDS deserializer on the converter board. The display is dual-channel LVDS but the TK1 is single-channel, so I almost doubled the clock to keep the same bandwidth. Is this incorrect? I assumed that the clock is divided when the LVDS signal is converted from single channel to dual channel.
I don’t want to use 16bpp; I asked whether Xorg configuring the framebuffer to 16bpp is causing the problem. I tried to make Xorg work at 32bpp without success: the framebuffer gets reconfigured to 16bpp whenever X starts. Does configuring nvidia,fb-bpp to 16 make sense?
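For completeness, the standard way I tried to pin the depth was an xorg.conf fragment along these lines (assuming the driver honors /etc/X11/xorg.conf; Depth 24 normally gives a 32bpp framebuffer):

```
Section "Screen"
	Identifier "Default Screen"
	DefaultDepth 24
	SubSection "Display"
		Depth 24
	EndSubSection
EndSection
```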
Your understanding of the pixel clock frequency is definitely wrong. You need to specify the exact pixel clock frequency of your specific display, regardless of how many LVDS de-/serializers it passes through, as each one of them has its very own PLL and resamples to whatever rate is required!
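To illustrate, the required pixel clock follows directly from the panel timings; the blanking values below are just an example, the real ones come from your display’s datasheet:

```
pixel clock = h_total × v_total × refresh rate
            = (1920 + 160) × (720 + 30) × 60 Hz   (example blanking values)
            ≈ 93.6 MHz
```

This value stays the same no matter what converter sits between the TK1 and the panel.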
Is this valid even if a single-channel LVDS signal is converted to dual-channel LVDS?
I mean: if all 24 bits of pixel data are sent in a single LVDS clock cycle from the TK1 (is that correct?) and then deserialized by the DS90CF386, half of the deserialized data goes to the odd LVDS channel and the other half to the even LVDS channel of the DS90C387 serializer. If this is correct, each of the two LVDS channels transmits half the data compared to the TK1 output.
If the DS90C387 serializer does not buffer any data, is it correct to think that its LVDS outputs run at half the clock of the TK1 single-channel LVDS clock, in order to transmit the same data in the same time?
In any case I will try to lower the clock in the device tree and I will let you know.
Hello @marcel.tx and @jaski.tx,
I found the issue and am replying with the solution as feedback; it may be useful for someone else.
The single-channel to dual-channel converter was swapping the high nibble with the low nibble (bit order 4 5 6 7 0 1 2 3), so I updated the sor.c file to set the ROTDAT register field to 4 instead of 0 or 6 (TEGRA_DC_LVDS_24_0/TEGRA_DC_LVDS_24_1).
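For anyone facing the same converter behaviour, the change boils down to something like the following in drivers/video/tegra/dc/sor.c (the macro and function names below are illustrative, not the exact identifiers from the kernel tree):

```c
#include <linux/types.h>

/* Illustrative names: the 3-bit ROTDAT field of the SOR LVDS register
 * rotates the per-lane pixel data. TEGRA_DC_LVDS_24_0 corresponds to a
 * rotation of 0 and TEGRA_DC_LVDS_24_1 to a rotation of 6; a converter
 * that swaps the high and low nibbles (bit order 4 5 6 7 0 1 2 3)
 * needs a rotation of 4. */
#define SOR_LVDS_ROTDAT_SHIFT	0
#define SOR_LVDS_ROTDAT_MASK	(0x7 << SOR_LVDS_ROTDAT_SHIFT)

static u32 sor_lvds_set_rotdat(u32 lvds_reg, u32 rotation)
{
	lvds_reg &= ~SOR_LVDS_ROTDAT_MASK;		/* clear the old rotation */
	lvds_reg |= (rotation << SOR_LVDS_ROTDAT_SHIFT)
			& SOR_LVDS_ROTDAT_MASK;		/* program the new one */
	return lvds_reg;
}
```

In my case, programming a rotation of 4 into this field before the LVDS output is enabled fixed the nibble swap.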