From improvements in the efficiency of OLED materials to software developments and new testing techniques, OLED burn-in risk has been lowered. OLED monitors are generally a more sound investment than ever—at least for the right person.

  • ShortFuse@lemmy.world
    1 year ago

    The TL;DR is that pixels are now tracked for how long they’ve been lit. The device can then evenly wear the other pixels so usage stays uniform. The trade-off is that you lose some max brightness in the name of screen uniformity.
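    The trade-off described above can be sketched as a toy model. Everything here is hypothetical (the function names, the linear-decay constant, the idea that compensation targets the most-worn pixel); actual panel compensation algorithms are proprietary and far more sophisticated.

```python
def degraded_output(base_luminance, lit_hours, decay_per_hour=1e-5):
    """Effective luminance of a subpixel after `lit_hours` of use,
    under a hypothetical linear wear model."""
    return base_luminance * max(0.0, 1.0 - decay_per_hour * lit_hours)

def compensation_gains(lit_hours_map, decay_per_hour=1e-5):
    """Scale every subpixel down to match the most-worn one.

    The panel stays uniform, but peak brightness drops to the level of
    the most-degraded subpixel -- the trade-off described above.
    """
    worst = max(lit_hours_map)
    target = degraded_output(1.0, worst, decay_per_hour)
    return [target / degraded_output(1.0, h, decay_per_hour)
            for h in lit_hours_map]

# Example: three subpixels with different accumulated on-times (hours).
hours = [100.0, 5000.0, 20000.0]
gains = compensation_gains(hours)
# Fresh pixels get the largest dimming; the most-worn pixel runs at full
# drive, and every pixel ends up at roughly the same output luminance.
outputs = [g * degraded_output(1.0, h) for g, h in zip(gains, hours)]
```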

    The other point is a shift in the TFT layer that people mistake for burn-in, but it’s actually image retention. It is resolved by these TV maintenance cycles. Just hope the compensation cycle actually runs, since some panels fail to trigger it.

    Check out this RTINGS video for a good overview of lots of different TV brands and how they perform.

    PS: Burn-in is actually a misnomer from the CRT era. There’s nothing burning in; the pixels are burning (wearing) out.

  • Whirling_Cloudburst@lemmy.world
    1 year ago

    This is a good read.

    On a side note: Anyone remember the story of the guy who went on vacation, and the buddy watching his place left gay porn paused on the plasma screen as a joke?

  • Cossty@lemmy.world
    1 year ago

    I won’t use an OLED monitor with a desktop PC. With my usage, I would have burn-in within two years, if not sooner. Even setting that aside, I’d always have the nagging thought that the more I use it, the sooner it will burn in, and that any time I leave the PC, even for a minute, I should turn it off. I like good contrast and blacks, so my next monitor will probably be a good VA with local dimming.