Issue with TK1 and LVDS display color

Hello everybody,
I’m using a 24-bit single-channel LVDS display at 1920x720 resolution with a Toradex TK1.

I configured the timings and other parameters in the device tree, and the display shows the desktop after startup.
The resolution seems fine, but there is a strange issue with colors. I’m not sure whether it is a display timing problem or a software configuration issue.
This is what I see:

The image is not crisp and the bottom bar looks blended or color-degraded. The mouse arrow is fine across the entire display, so I assume it is not a timing problem, but I’m open to other explanations.

Another strange thing: before LXDE starts, the touch calibration screen is shown, and that one is clear and crisp.

Here is a detail with the mouse over the menu bar:

Any suggestions?

edit: this is an extract of the device tree:

	lvds: lvds {
		status = "okay";
		display {
			status = "okay";
			lvds-drive-strength = <0x40 0x40 0x40 0x40 0x40>;
			disp-default-out {
				status = "okay";
				nvidia,out-type = <TEGRA_DC_OUT_LVDS>;
				nvidia,out-flags = <TEGRA_DC_OUT_CONTINUOUS_MODE>;
				nvidia,out-parent-clk = "pll_d_out0";
				nvidia,out-max-pixclk = <3367>; /* KHZ2PICOS(297000) */
				nvidia,out-align = <TEGRA_DC_ALIGN_MSB>;
				nvidia,out-order = <TEGRA_DC_ORDER_RED_BLUE>;
				nvidia,out-depth = <24>;
				nvidia,out-lvds-mode = <TEGRA_DC_LVDS_24_0>;
				nvidia,out-xres = <1920>;
				nvidia,out-yres = <720>;
			};
			display-timings {
				timing_1920_720: 1920x720 {
					clock-frequency = <85000000>;
					nvidia,h-ref-to-sync = <1>;
					nvidia,v-ref-to-sync = <1>;
					hsync-len = <32>;
					vsync-len = <10>;
					hback-porch = <64>;
					vback-porch = <20>;
					hactive = <1920>;
					vactive = <720>;
					hfront-porch = <32>;
					vfront-porch = <10>;
				};
			};
		};
	};

	dc@54200000 {
		status = "okay";
		nvidia,dc-connection = <&lvds>;
		nvidia,dc-flags = <TEGRA_DC_FLAG_ENABLED>;
		nvidia,emc-clk-rate = <300000000>;
		nvidia,fb-bpp = <32>; /* bits per pixel */
		nvidia,fb-flags = <TEGRA_FB_FLIP_ON_PROBE>;
		avdd-supply = <&as3722_ldo4>;
	};


Here is another detail of the screen:

Hi @txrx

Could you provide the software version of your module?

Did you clear the vidargs as explained here?

Could you share the details of the display you are using?

Could you show the following pictures on your display and share how they look?

Thanks and best regards,

Thanks for your reply. One more detail that may help: if I stop the X server, fbset reports the framebuffer working at 32 bpp, while it runs at 16 bpp when the X server is running.
Here are replies to your questions:

  • Software version: 2.8b6
  • Vidargs are empty
  • The display is a dual-channel LVDS connected to the TK1 through a single-channel-to-dual-channel LVDS converter board. It works in DE-only mode and these are the timings:
  • The converter board has a DS90CF386 LVDS deserializer and a DS90C387 LVDS serializer
  • This is how the test pictures look on the screen:
    Thank you very much,
    Best regards


And you are absolutely sure nvidia,out-lvds-mode being TEGRA_DC_LVDS_24_0 is proper?

And why exactly would you use a clock-frequency of 85 MHz when your display says it needs one between 44.6 and 52 MHz?

Why is it that none of your display timings are actually depicted in the device tree?

If you truly want to use 16 bpp, why exactly would you then set nvidia,fb-bpp to 32?

I assume you had a look at the following article on our developer website already:

Hi Marcel,
thanks for reply. Here are my answers:

I also tried TEGRA_DC_LVDS_24_1, and the result was a black desktop wallpaper and the green Toradex logo rendered red. So I assumed that TEGRA_DC_LVDS_24_0 is correct.

The 85 MHz clock frequency is the maximum of the LVDS deserializer on the converter board. The display is dual-channel LVDS but the TK1 is single-channel, so I roughly doubled the clock to keep the same bandwidth; is this incorrect? I assumed that the clock is halved when the LVDS signal is converted from single channel to dual channel.

For my timing setup I followed Jaski’s answer in the linked topic, since the display works in DE-only mode:

I don’t want to use 16 bpp; I asked whether Xorg configuring the framebuffer to 16 bpp is causing the problem. I tried to make Xorg work at 32 bpp without success: the framebuffer gets reconfigured to 16 bpp whenever X starts. Does setting nvidia,fb-bpp to 16 make sense?

Your understanding of the pixel clock frequency is definitely wrong. You need to specify the exact pixel clock frequency of your specific display, regardless of how many LVDS de-/serializers it passes through, as each of them has its own PLL to upsample to whatever is required!
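As a quick sanity check, the pixel clock follows directly from the timings (htotal × vtotal × refresh rate). This sketch plugs in the values from your device tree extract above, together with an assumed 60 Hz refresh rate:

```python
# Sketch: derive the pixel clock implied by the panel timings.
# Timing values are taken from the device tree extract above;
# the 60 Hz refresh rate is an assumption.
hactive, hfront, hsync, hback = 1920, 32, 32, 64
vactive, vfront, vsync, vback = 720, 10, 10, 20
refresh_hz = 60

htotal = hactive + hfront + hsync + hback  # total pixels per line
vtotal = vactive + vfront + vsync + vback  # total lines per frame
pixclk_hz = htotal * vtotal * refresh_hz

print(f"htotal={htotal}, vtotal={vtotal}, "
      f"pixel clock={pixclk_hz / 1e6:.1f} MHz")
```

Note that with these same timings, a 45 MHz clock would correspond to a refresh rate of only about 29 Hz, so the porch and sync values probably need to be rechecked against the panel datasheet as well.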

Is this valid even if a single channel LVDS is converted to a dual channel LVDS?

I mean: if all 24 bits of pixel data are sent in a single LVDS clock cycle (correct?) from the TK1 and then deserialized by the DS90CF386, half of the deserialized data goes to the odd LVDS channel and the other half to the even LVDS channel of the DS90C387 serializer. If this is correct, each of the dual LVDS channels transmits half the data compared to the TK1 output.

If the DS90C387 serializer does not buffer any data, is it correct to think that its LVDS outputs run at half the clock of the TK1 single-channel LVDS clock, in order to transmit the same data in the same time?

In any case, I will try to lower the clock in the device tree and let you know.


I just tried with a 45 MHz clock, but the result is the same.


Did you try to use 16 bpp? Does this help?
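Alternatively, to keep X at 24-bit depth, a minimal xorg.conf fragment along these lines might be worth a try (the section identifier is just a placeholder for your setup):

```
Section "Screen"
	Identifier   "Default Screen"
	DefaultDepth 24
	SubSection "Display"
		Depth 24
	EndSubSection
EndSection
```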

Hello @marcel.tx and @jaski.tx ,
I found the issue and am replying with feedback, as it may be useful for someone else.

The single-channel-to-dual-channel converter was swapping the high half-byte with the low half-byte of each data byte (bit order: 45670123), so I updated the sor.c file to set the ROTDAT register to 4 instead of 0 or 6 (TEGRA_DC_LVDS_24_0/TEGRA_DC_LVDS_24_1).
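For anyone hitting the same converter quirk: the 45670123 bit order is simply the high and low nibble of each data byte swapped, which a quick sketch can verify (the pixel value here is just an example):

```python
def swap_nibbles(byte: int) -> int:
    """Apply bit order 45670123: the low nibble ends up where
    the high nibble belongs, and vice versa."""
    return ((byte & 0x0F) << 4) | ((byte & 0xF0) >> 4)

# Example: a bright color component value 0xF0 comes out as 0x0F,
# so bright colors turn dim and dim ones turn bright.
print(hex(swap_nibbles(0xF0)))  # -> 0xf
```

Applying the swap twice returns the original byte, which is consistent with the picture looking almost right apart from the washed-out colors.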

Thanks for your support,


Perfect that it works. Thanks very much for your feedback.

Best regards,