What is HDR, anyway?
I did my PhD in Atomic, Molecular, and Optical (AMO) physics, and despite "optical" being part of that, I realized midway that I didn't know enough about how regular cameras worked!
It didn't take very long to learn, and it turned out to be extremely important in the work I did during the early days at Waymo and later at Motional.
I wanted to pass along this fun video from several years ago that discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's short, and I recommend it to all HN readers.
Separately, if you want a more serious introduction to digital photography, I recommend the lectures by Marc Levoy from his Stanford course: https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY... . I believe he runs his own group at Adobe now, after leading a successful effort at Google that made their Pixel cameras the best in the industry for a couple of years. (And then everyone more or less caught up, just like with most tech improvements in the history of smartphones.)
It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.
We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark colors. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.
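To make that extra degree of freedom concrete, here is a toy sketch of re-exposing a linear float “negative” after the fact (my own illustration with made-up values, not any particular raw pipeline):

    import numpy as np

    def reexpose(linear_hdr, ev):
        """Apply an exposure adjustment of `ev` stops to a linear float HDR image,
        then clip to a display-ready 0..1 range. Clipping happens only now, at
        display time, instead of at capture time."""
        return np.clip(linear_hdr * (2.0 ** ev), 0.0, 1.0)

    # A synthetic "negative": values above 1.0 would be gone in a clipped SDR capture,
    # but survive in the float representation and can be pulled back later.
    scene = np.array([0.02, 0.5, 4.0, 16.0])
    print(reexpose(scene, ev=-2))  # pull down 2 stops: bright detail reappears
    print(reexpose(scene, ev=+1))  # push up 1 stop: shadows open, highlights clip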
HDR on displays is actually largely uncomfortable for me. They should reserve the brightest HDR whites for things like the sun itself and caustics, not white walls in indoor photos.
As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
As a photographer, I get the appeal of (this new incarnation of) HDR content, but the practical reality is that photos posted in my feeds go from looking normal to searing my retinas, while other content that was uniform white a second earlier now looks dull gray.
It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
HDR is when you’re watching a dark film at night, looking at the subtle nuances between shades of dark and black in the shadows on the screen, making out the faint contours the film director carefully curated, and the subtitles gently deposit 40W of light into your optic nerves with “♪”.
> A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.
This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
> AI cannot read your mind, so it cannot honor your intent.
This. I can always tell whether someone "gets" software development by whether they understand that computers can't read minds or infer intent the way a person can.
Just one other thing: in analog you also have compensating developers, which exhaust faster in darker areas (or lighter, if you think in negative) and give lighter areas more time to develop and show, and hence some more control of the range. The same applies, to a lesser degree, with stand development, which uses very low dilutions of the developer and no agitation. So dodging and burning is not the only way to achieve higher dynamic range in analog photos.
As for HDR on phones, I think it's a blight on photography. No more shadows and highlights. I find it good for capturing family moments, but not as a creative tool.
So, HN, are HDR monitors worth it? I remember ~10 years ago delaying my monitor purchase for the HDR one that was right around the corner, but never (in my purchasing scope) became available. Time for another look?
The utility of HDR (as described in the article) is without question. It's amazing looking at an outdoor scene (or an indoor one with windows) with your Mk-1 eyeballs, then taking a photo and looking at it on a phone or PC screen. The pic fails to capture the lighting range your eyes see.
Note for Firefox users - view the page in Chrome to see more of what they are talking about. I was very confused by some of the images, and it was a world of difference when I tried again in Chrome. Things began to make a lot more sense - is there a flag I am missing in Firefox on the Mac?
Having come from professional video/film tooling in the 90's to today, it's interesting to see the evolution of what "HDR" means. I used to be a purist in this space, where SDR meant ~8 stops (powers of two) or less of dynamic range, and HDR meant 10+. Color primaries and transfer function mapping were things I spoke specifically about. At this point, though, folks use "HDR" to refer to combinations of things.
Around this, a bunch of practical tooling surfaced (e.g., hybrid log approaches to luminance mapping) to extend the thinking from 8-bit gamma-mapped content presenting ~8 stops of dynamic range to where we are now. If we get away from just trying to label everything "HDR", there are some useful things people should familiarize themselves with:
1. Color primaries: examples - SDR: Rec. 601, Rec. 709, sRGB. HDR: Rec. 2020, DCI-P3. The new color primaries expand the chromatic representation capabilities. This is pretty easy to wrap our heads around: https://en.wikipedia.org/wiki/Rec._2020
2. Transfer functions: examples - SDR: sRGB, BT.1886. HDR: Rec. 2100 Perceptual Quantizer (PQ), HLG. The big thing in this space to care about is that SDR transfer functions had reference peak luminance but were otherwise relative to that peak luminance. By contrast, Rec. 2100 PQ code points are absolute, in that each code value has a defined meaning in measurable luminance, per the PQ EOTF transfer function. This is a big departure from our older SDR universe and from Hybrid Log Gamma approaches. (A small numeric sketch of the PQ EOTF follows at the end of this comment.)
3. Tone mapping: In SDR, we had the comfort of camera and display technologies roughly lining up in the video space, so living in a gamma/inverse-gamma universe was fine. We just controlled the eccentricity of the curve. Now, with HDR, we have formats that can carry tone-mapping information and transports (e.g., HDMI) that can bidirectionally signal display target capabilities, allowing things like source-based tone mapping. Go digging into HDR10+, Dolby Vision, or HDMI SBTM for a deep rabbit hole. https://en.wikipedia.org/wiki/Tone_mapping
So HDR is everything (and nothing), but it's definitely important. If I had to emphasize one thing that is non-obvious to most new entrants into the space, it's that there are elements of description of color and luminance that are absolute in their meaning, rather than relative. That's a substantial shift. Extra points for figuring out that practical adaptation to display targets is built into formats and protocols.
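To make the "absolute" point concrete, here is a quick sketch of the ST 2084 / Rec. 2100 PQ EOTF, using the published constants; the printed code values are just illustrative:

    import numpy as np

    # SMPTE ST 2084 / Rec. 2100 PQ EOTF constants
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(code_value):
        """Map a normalized PQ code value (0..1) to absolute luminance in cd/m2 (nits)."""
        e = np.power(np.clip(code_value, 0.0, 1.0), 1.0 / M2)
        y = np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)
        return 10000.0 * y  # PQ is defined up to 10,000 nits

    # The same code value means the same luminance no matter what display it lands on:
    for cv in (0.25, 0.5, 0.75, 1.0):
        print(f"code {cv:.2f} -> {pq_eotf(cv):8.1f} nits")

That is the shift in a nutshell: a PQ code value of 0.75 means roughly 1,000 nits by definition, whereas an SDR value of 0.75 just meant "75% of the way to whatever this display calls white."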
A related article on DPReview about Sigma's HDR map in JPEGs has some tasteful HDR photos. You have to click/tap the photo to see the effect.
https://www.dpreview.com/news/7452255382/sigma-brings-hdr-br...
Does anyone else find the hubris in the first paragraph as off-putting as I do?
"we finally explain what HDR actually means"
Then it spends two-thirds of the article on a tone-mapping expedition, only to never address the elephant in the room: the almost complete absence of predictable color management in consumer-grade digital environments.
UIs are hardly ever tested in HDR: I don't want my subtitles to burn out my eyes on an actual HDR display.
This is where you, the consumer, watching in a properly dark environment, are as vulnerable to light as when raising the window curtains on a bright summer morning. (That brightness abuse by content is actually discussed in the article.)
Dolby Vision and Apple have the lead here as closed platforms; on the web it's simply not predictably possible yet.
From my impression, the best hope is the efforts of the Color on the Web Community Group.
The dad’s photo at the end looks so much better in SDR on a typical desktop IPS panel (Windows 11). The HDR photo looks like the brightness is smushed in the most awful way. On an iPhone, the HDR photo is excellent and the others look muted.
I wonder if there’s an issue in Windows tonemapping or HDR->SDR pipeline, because perceptually the HDR image is really off.
It’s more off than if I took an SDR picture of my iPhone showing the HDR image and showed that SDR picture on said Windows machine with an IPS panel. Which tells me that the manual HDR->SDR “pipeline” I just described is better.
I think Windows showing HDR content on a non-HDR display should just pick an SDR-sized section of that long dynamic range and show it normally, without trying to remap the entire large range to a smaller one (sketched below), or it should do some other perceptual improvement.
Then again, I know professionally that Windows HDR is complicated and hard to tame. So I’m not really sure of the context for remapping the way they do; maybe it's the only way in some rare contingency scenario.
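For what I mean, here is a toy comparison (definitely not Windows' actual pipeline; the 203-nit reference white and the simple Reinhard-style curve are just illustrative choices):

    import numpy as np

    def window_to_sdr(hdr_nits, sdr_peak=203.0):
        """Option A: keep an SDR-sized slice of the range and clip everything above it."""
        return np.clip(hdr_nits, 0.0, sdr_peak) / sdr_peak

    def compress_to_sdr(hdr_nits, sdr_peak=203.0):
        """Option B: squeeze the whole HDR range into SDR with a simple global curve."""
        x = hdr_nits / sdr_peak
        return x / (1.0 + x)

    scene = np.array([1.0, 50.0, 203.0, 1000.0, 4000.0])  # luminances in nits
    print(window_to_sdr(scene))    # midtones untouched, extreme highlights clip
    print(compress_to_sdr(scene))  # nothing clips, but midtones come out dim and flat

The second output shows why a global remap can look "smushed": everything below reference white gets pushed down to make room for highlights that may not even matter for the image.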
I find the default HDR (as in gain map) presentation of iPhone photos to look rather garish, rendering highlights too bright and distracting from the content of the images. The solution I came up with for my own camera app was to roll off and lower the highlights in the gain map, which results in final images that I find way more pleasing. This seems to be somewhat similar to what Halide is introducing with their "Standard" option for HDR.
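For the curious, this is roughly the shape of the roll-off I mean; the knee and ceiling values here are made up for illustration, not the exact curve my app or Halide uses:

    import numpy as np

    def roll_off_gain(gain_stops, knee=1.0, ceiling=2.0):
        """Soften a gain map expressed in stops: boosts below `knee` pass through,
        boosts above it are compressed so they approach `ceiling` asymptotically."""
        g = np.asarray(gain_stops, dtype=float)
        over = np.maximum(g - knee, 0.0)
        compressed = knee + (ceiling - knee) * np.tanh(over / (ceiling - knee))
        return np.where(g <= knee, g, compressed)

    # Speculars that would have been boosted 3-4 stops now top out around 2 stops,
    # so highlights stay bright without dominating the frame.
    print(roll_off_gain([0.2, 0.8, 1.5, 3.0, 4.0]))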
Hopefully HN allows me to share an App Store link... this app works best on Pro iPhones, which support ProRAW, although I do some clever stuff on non-Pro iPhones to get a more natural look.
The article starts by saying HDR can mean different things and gives Apple's HDR vs. new TV "HDR" advertising as an example, but doesn't explain at all what the TV HDR means and how it is different, unless I missed something. I always assumed they were the same thing.
I have a similar feeling about HDR to the one I have about movie audio mixing. The range is just too big. This distaste has also amplified in me with age. I appreciate content that makes use of the limited space it gets, and also algorithms that compress the range for me a bit. KMPlayer, for example, has this on for volume by default, and it makes home movie watching more comfortable, no doubt sacrificing artistic vision in the process. I feel a bit the same about the loudness war - and maybe a lot of other people feel the same too, seeing how compressed audio got. At the very least they don't mind much.
I really appreciate the article. I could tell they also have a product to present, because of the many references, but it was very informative besides that.
I do prefer the gain map approach myself. Do a sensible '32 bit' HDR edit, further tweak an SDR version, and export a file that has both.
Creative power is still in your hands versus some tone mapper's guesses at your intent.
Can people go overboard? Sure, but that's something they will do regardless of any HDR or lack thereof.
On an aside, it's still rough that just about every site that touches gain map HDR images (adaptive HDR, as this blog calls them) will lose that metadata if it needs to scale, recompress, or otherwise transform the images. It's led me to make my own site, but also to be a bit smarter about which files a client gets. For instance, if a browser doesn't support .jxl or .avif images, I'm sure it won't want an HDR JPEG either, and that's easy to handle on a webserver.
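A minimal sketch of that webserver check, assuming you've pre-generated the variants (the filenames and the heuristic itself are mine, nothing standard):

    def pick_variant(accept_header: str) -> str:
        """Choose a pre-generated image variant from the Accept header, on the theory
        that a client without AVIF/JXL support probably won't handle a gain-map JPEG
        gracefully either."""
        accept = accept_header.lower()
        if "image/jxl" in accept:
            return "photo-hdr.jxl"
        if "image/avif" in accept:
            return "photo-hdr.avif"
        return "photo-sdr.jpg"  # conservative fallback: plain SDR JPEG, no gain map

    print(pick_variant("image/avif,image/webp,image/png,*/*"))  # -> photo-hdr.avif
    print(pick_variant("image/webp,*/*"))                       # -> photo-sdr.jpg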
I'm not entirely convinced that greedy influencers are to blame for people hating on overly bright content. Instead, I think something is different about how displays produce brightness compared to nature outside. Light outside can reach tens of thousands of nits, yet even 1000 nits is searing on a display. Is it that displays output polarized light? Is it the spectral distribution of especially the better displays being three really tight peaks? I cannot tell you, but I suspect something isn't quite right.
All this aside, HDR and high brightness are different things - HDR is just a representational thing. You can go full send on your SDR monitor as well, you'll just see more banding. The majority of the article is just content marketing about how they perform automatic tonemapping anyway.
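On the banding point, a toy illustration of why bit depth is the thing that changes (ignoring transfer functions, which matter a lot in practice; the ramp values are arbitrary):

    import numpy as np

    def quantize(gradient, bits):
        """Round a 0..1 gradient to the nearest code value representable at `bits` depth."""
        levels = 2 ** bits - 1
        return np.round(gradient * levels) / levels

    ramp = np.linspace(0.0, 0.05, 1000)  # a slow, dark gradient: the worst case for banding
    for bits in (8, 10):
        steps = len(np.unique(quantize(ramp, bits)))
        print(f"{bits}-bit: {steps} distinct steps across this dark ramp")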
If anyone was hoping for a more technical explanation, I find these pages do a good job of explaining the inner workings behind the formats:
https://docs.krita.org/en/general_concepts/colors/bit_depth....
https://docs.krita.org/en/general_concepts/colors/color_spac...
https://docs.krita.org/en/general_concepts/colors/scene_line...
Did I miss something, or is the tone mapper not released yet? I admit I'm multitasking here, but I just see an exposure slider in the Image Lab (or whatever it's called).
Sidebar: I kinda miss when Halide's driving purpose was rapid launch and simplicity. I would almost prefer a zoom function to all of this HDR gymnastics (though, to be clear, Halide is my most-used and most-liked camera app).
EDIT: Ah, I see, it's a Mark III feature. That is not REMOTELY clear in the (very long) post.
For Halide's updated Image Lab demo about two-thirds of the way down the page (https://www.lux.camera/content/media/2025/05/skyline-edit-tr...), you made the demo so tall that desktop users can't see the sky and the controls at the same time.
A lot of these design flaws would be fixed by Firefox's picture-in-picture option, but for some reason, with the way you coded it, the prompt to pop it out as PiP doesn't show up.
Is there a consensus definition of what counts as "HDR" in a display? What is the "standard dynamic range" of a typical TV or computer monitor? Is it roughly the same for devices of the same age?
My understanding is that most SDR TVs and computer screens have a peak brightness of about 200-300 nits (i.e., cd/m²). Is that the correct measure of the range of the display? Is the brightest white 300 nits brighter than the darkest black?
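For reference, a rough sketch of the usual arithmetic, assuming dynamic range is quoted as a ratio (contrast ratio or stops) rather than a difference in nits; the peak and black-level numbers here are illustrative, not measurements:

    from math import log2

    def dynamic_range(peak_nits, black_nits):
        """Dynamic range as a ratio of brightest to darkest, also expressed in stops."""
        ratio = peak_nits / black_nits
        return ratio, log2(ratio)

    for name, peak, black in [("typical SDR LCD", 300, 0.3), ("HDR OLED", 1000, 0.0005)]:
        ratio, stops = dynamic_range(peak, black)
        print(f"{name}: {ratio:,.0f}:1 contrast, ~{stops:.1f} stops")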
As a photographer, one of the things that draws me to the work of artists like Bill Brandt and Daido Moriyama is their use of low dynamic range, and high contrast. I rarely see an HDR image that is aesthetically interesting.
So, if cameras have poor dynamic range, how are they getting away with a single exposure? They didn't explain that at all...
Half-Life 2: Lost Coast was exciting.
Glad to say all this "Instagram influencers searing eyeballs with bright whites" stuff is news to me. All I know about is QR code mode doing that.
Isn't the result of their tone mapping algo similar to adjusting shadow and highlight sliders in other software?
These HDR controls in their Halide app are interesting. Does Android have similar apps?
Back in university I implemented a shoddy HDR for my phone camera.
The hardest part of it, by far, was taking hundreds upon hundreds of pictures of a blank piece of paper in different lighting conditions with different settings.
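If anyone is curious, the merge step itself is short once the response is linearized (a minimal sketch assuming a linear response; all that blank-paper shooting is what it takes to recover the response curve in the first place):

    import numpy as np

    def merge_exposures(images, exposure_times):
        """Merge bracketed 8-bit frames into a relative radiance map, weighting
        well-exposed pixels most and distrusting values near clipping."""
        acc = np.zeros(images[0].shape, dtype=float)
        weight_sum = np.zeros_like(acc)
        for img, t in zip(images, exposure_times):
            z = img.astype(float) / 255.0
            w = 1.0 - np.abs(2.0 * z - 1.0)  # hat weight: trust midtones, not the ends
            acc += w * (z / t)               # divide out shutter time to get radiance
            weight_sum += w
        return acc / np.maximum(weight_sum, 1e-6)

    # Three fake 2x2 frames bracketed one stop apart
    frames = [np.full((2, 2), v, dtype=np.uint8) for v in (40, 90, 180)]
    print(merge_exposures(frames, [1 / 60, 1 / 30, 1 / 15]))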
This was super awesome. Thanks for this! Especially the HDR photo reveal felt really awesome.
In my opinion, HDR is another marketing gimmick -- the average layman has no idea what it means, but it sounds fancy and is more expensive, so surely it's good.
As a non-techie I represent the 99.9% of the population who haven't a clue what tone mapping etc. is: NO WAY would we ever touch the various settings possible as opposed to watching the TV/computer screen/etc. as it came out of the box.
Is there any workflow that can output HDR photos (like the real HDR kind, with metadata to tell the display to go into HDR mode) for photos shot with a mirrorless and not an iPhone?
"The Ed Hardy T-Shirt of Photography"
Literal snort.
I chuckled at "The Ed Hardy t-shirt of photography" for the early, overdone "HDR-mode" images.
I am still skeptical about HDR, as pretty much all HDR content I see online is awful. But this post makes me believe that Lux/Halide can pull off HDR in a way that I will like. I am looking forward to Halide Mk3.
This page crashed Brave on Android three times before I gave up.
Well crap, I had written a thank you, which I will gladly write again:
I love when product announcements, and ads in general, are high-value works. This one was good education for me. Thank you for it!
I had also written about my plasma and CRT displays and how misunderstandings about HDR made things generally worse and how I probably have not seen the best these 10 bit capable displays can do.
And finally, I had written about 3D TV and how fast 3D (at least 60 Hz per eye) in my home made for amazing modeling and assembly experiences! I was very sad to see that tech dead-end.
3D for technical content creation has a lot of legs... if only more people could see it running great...
Thanks again. I appreciate the education.
I have a question: how can I print HDR? Is there any HDR printer + paper + display lighting setup?
My hypotheses are the following:
- Increase display lighting to increase peak white point + use a black ink able to absorb more light (can Vantablack-style pigments be made into ink?) => increase dynamic range of a printable picture
- Alternatively, have the display lighting include visible light + invisible UV light, and have the printed picture include an invisible layer of UV ink that shines white: the pattern printed in invisible UV ink would be the "gain map" that increases peak brightness past the incident visible light into HDR range.
What do you folks think?
... something Linux desktops don't understand, and Macs only do well with their own displays, for video. Guess who the winner is on the desktop: Windows oO
TL;DR: lots of vivid colors.
HDR is just a scene-referred image using absolute luminance.
> Our eyes can see both just fine.
This gets to a gaming rant of mine: Our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light-level, focus) while our brain is compositing it together into what feels like a single moment.
However, certain effects in games (e.g. "HDR" and depth of field) instead reduce the fidelity of the experience. These features limp along only while our gaze is aimed at the exact spot the software expects. If you glance anywhere else around the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.
So ultimately these features reduce the realism of the experience. They make it less like being there and more like you're watching a second-hand movie recorded on flawed video-cameras. This distinction is even clearer if you consider cases where "film grain" is added.