In the raw footage the wavefront will almost always appear to move from the closer points to the farther points, simply because light scattered from nearer points reaches the sensor sooner, and you can see that in their example here. This is now somewhat misleading, as they're synthetically moving the camera in post (but not the pulse propagation). I point this out because the team has also developed an "unwarping" technique to cancel the effect out, and they demonstrate it at the bottom of their website; a rough sketch of the idea follows the links below. Note how the scattered light then propagates outward intuitively.
https://anaghmalik.com/FlyingWithPhotons/
https://anaghmalik.com/FlyingWithPhotons/media/moving_videos...
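As I understand the unwarping (my own illustration, not their code): each scene point's transient gets shifted earlier by that point's light travel time back to the camera, so the video shows light as it arrives at the scene point rather than at the sensor.

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def unwarp_transient(transient, point_xyz, cam_xyz, bin_width_s):
        # transient: 1-D numpy array, one pixel's time-of-flight histogram.
        # Travel time from this scene point back to the camera.
        delay_s = np.linalg.norm(point_xyz - cam_xyz) / C
        shift = int(round(delay_s / bin_width_s))  # assumes shift < len(transient)
        # Shift the histogram earlier in time, zero-padding the tail.
        out = np.zeros_like(transient)
        out[:len(transient) - shift] = transient[shift:]
        return out

Applied per point, the wavefront then expands outward from the source instead of sweeping from near to far.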
https://arxiv.org/pdf/2404.06493 is the paper. If I'm understanding it correctly, the camera isn't actually capturing a pulse of light. Instead, it's recording single pixels from a 10 MHz series of pulses using a single-pixel camera that rotates around the object. It then uses this time series of data to render a video of a "single" virtual pulse via a NeRF.
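If that reading is right, the raw data per viewpoint is essentially time-correlated photon counting: fold the photon arrival timestamps by the laser's pulse period to build one pixel's time-of-flight histogram, then fit the NeRF to many such histograms. A toy version (the names and bin count are mine, not the paper's):

    import numpy as np

    REP_RATE_HZ = 10e6            # 10 MHz pulse train, per the paper
    PERIOD_S = 1.0 / REP_RATE_HZ  # 100 ns between pulses
    N_BINS = 4096                 # arbitrary histogram resolution

    def transient_histogram(photon_times_s):
        # Fold absolute photon arrival times onto one pulse period.
        phases = np.mod(photon_times_s, PERIOD_S)
        # Bin into a transient: photon counts vs. time since the pulse fired.
        hist, _ = np.histogram(phases, bins=N_BINS, range=(0.0, PERIOD_S))
        return hist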
The "AI" in the title appears to be click bait since the paper doesn't mention AI, and a NeRF isn't really AI in the colloquial sense even though it uses a DNN.
Hmm, whenever AI is involved I always wonder whether what I'm seeing is realistic or not.
Did Coca-Cola sponsor/fund this study? Why the need for the label still being visible? Seems like you'd want not to obstruct the view behind the label, you know, for science. There's zero purpose in having a bottle with any label; the shape of the bottle is part of their trademark, so it would be obvious anyway.
Apparently, I'm really sick of constant bombardment from corporate branding.
I thought I could only see photons that hit my retina :)
Why does the refraction appear instantly below the bottle rather than taking time for the light to propagate there?
For those curious, here is prior art from 12 years ago, capturing light in a Coke bottle with a single streak camera.
https://www.youtube.com/watch?v=EtsXgODHMWk
https://web.media.mit.edu/~raskar/trillionfps/
I remember when this dropped and where I was when I watched it and read the paper. One of the coolest things I'd ever seen.
This seems like a great extension of the work. I'm okay with trading accuracy for crude usefulness as a model. Making this interactive and putting it in the hands of curious minds is the next step.