Hirox, a leading provider of 3D digital microscope technology, has painstakingly created the world’s largest 3D scan, a super-detailed 108-gigapixel scan of Johannes Vermeer’s iconic painting, “Girl with a Pearl Earring.”
https://www.hirox-europe.com/gigapixel/girl-with-a-pearl-earring/
Hirox website that lets you interact with their scan of the painting
If you check it out, don’t forget to have a look at the somewhat hidden 3D mode. The 2D mode, though well made, is just a Google-Maps-like view; the 3D mode is something else entirely.
Cool, STL?
I was zooming in and out of the 3D image of the pearl, watching the very recognisable pearl transform into an Arctic alien landscape as I zoomed in, and then back into the pearl as I zoomed out again.
I did the exact same thing!
Guys, you just have to look at the earring; if it’s a star, Redd is trying to sell you a fake.
From an art science perspective, this is indeed interesting. I don’t think it’s particularly helpful, or even required, for appreciating the subject matter. Very cool.
It’s a way to create a digital snapshot to preserve art even if the physical edition is lost. That’s important, in my opinion.
It makes it possible to monitor the deterioration of paintings and to better understand potential risks. It’s a massively helpful tool for preservation efforts.
What a great job they did! Developing the rig and stitching and normalizing the images must have been a ton of work.
It really looks amazing
Thank you!
We’re going to get so many HQ copy memes
Infrared multispectral imaging would be interesting, to see what’s under the surface and whether there are any pentimentos underneath.
Can someone please ELI5? It felt like I was reading the wrong language there.
“Pentimentos” just refers to the five-sided breath mints widely known to have been Vermeer’s favorite. Often trace amounts will turn up under a microscope. It’s how they know it’s authentically his.
Don’t know if you’re trolling, but pentimenti are small changes or corrections artists make while drawing or painting.
I kind of expected a Ryzen die shot tbh
Wonderful! The mouth is simply fantastic. And you can clearly see that he drew most of the head shape with umber or a similar color. Really interesting.
I really hope they’re able to scan more artwork in the future
I will scan The Scream at the Munch Museum in March 😱
Anybody know if I can download this with Dezoomify or something?
I doubt you want to. It’s probably at least a terabyte.
Nothing a little more jpg couldn’t fix.
I doubt the one they display on their website is a terabyte; that would require a ton of traffic just to render it in the browser for each visitor they get.
(I’m sure that isn’t the full resolution one, but it still looks pretty good)
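For a rough sense of scale, here’s a back-of-the-envelope sketch (my own numbers and assumptions, not Hirox’s): 108 gigapixels at 8-bit RGB is already over 300 GB uncompressed, while a tiled JPEG pyramid of the kind a deep-zoom viewer serves would be much smaller.

```typescript
// Back-of-the-envelope size estimate for a 108-gigapixel image pyramid.
// Assumptions (mine, not Hirox's): 8-bit RGB, roughly 10:1 JPEG compression,
// and a deep-zoom pyramid that adds about one third on top of the base level.
const pixels = 108e9;                   // 108 gigapixels
const rawBytes = pixels * 3;            // 8-bit RGB, uncompressed
const jpegBytes = rawBytes / 10;        // rough JPEG compression ratio
const pyramidBytes = jpegBytes * 4 / 3; // base level + lower zoom levels

const toGB = (b: number) => (b / 1e9).toFixed(0);
console.log(`raw: ~${toGB(rawBytes)} GB`);                    // ~324 GB
console.log(`tiled JPEG pyramid: ~${toGB(pyramidBytes)} GB`); // ~43 GB
```

And presumably the viewer only fetches the handful of tiles covering your current viewport at the current zoom level, so each visitor downloads megabytes, not the whole thing.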
Such insane tech
Sometimes it’s pretty cool living in the future.
During the opening of the exhibition and release of the website!
That’s spectacular. It must be impressive seeing that much detail in person.
Did you have a look at the video too? https://youtu.be/j_MvpMlgfwI?si=5eCuFoSirK85LOND By the way, if you have any questions about this project, feel free to ask :)
They should create an iPad (and/or iPhone) app which shows the painting in “3D” by adjusting the 2D image according to the viewing angle tracked by the Face ID camera.
Android powers over 70% of smartphones. If they were to make an app, wouldn’t it be more efficient (cost vs. exposure) to make an Android app? The website, though, works on all phones.
And you wouldn’t really need Face ID or any other depth-aware camera for this. A normal front-facing camera would do. Or go accelerometer-only and tilt the phone instead.
Maybe nowadays, with powerful phone processors, you don’t need depth-aware cameras for this anymore. But I remember that at the time Face ID was introduced, you needed Face ID (or two normal cameras) to do this on a phone.
I guess if you look at the target audience who may be interested in this app, you’re nearly at 80% using iPhones. Most of the 70% using Android are from developing countries, and I bet most of those users aren’t that interested in art like this 💁🏻♀️
Poor people can’t enjoy art now? What you’re proposing is artificially creating an economic barrier to accessing art, just because you assume that people born in developing countries aren’t interested in art.
Also, there are tons of people who buy Android phones even though they can afford an iPhone, because Apple’s devices are a gilded cage.
Sure, I share that opinion, but I still think most users of such an app would be on iOS if it existed on both platforms right now.
This example https://trekhleb.dev/blog/2021/gyro-web/ demonstrates how a 3D view can be orientated by tilting your phone. It is implemented using web technology.
It uses the device’s gyroscope to detect rotation and tilt. This is frequently used in map/street-view apps. No Face ID or depth camera is required.
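If anyone wants to play with the idea, here’s a rough sketch (mine, not from that blog post) of how the standard DeviceOrientationEvent web API could drive a simple parallax effect on a flat image; the element id and the tuning constants are just placeholders.

```typescript
// Minimal gyroscope-driven parallax sketch (assumes an <img id="painting"> on the page).
// DeviceOrientationEvent is a standard web API; iOS Safari additionally gates it
// behind requestPermission(), which must be called from a user gesture.
const img = document.getElementById('painting') as HTMLImageElement;

function onOrientation(e: DeviceOrientationEvent): void {
  const beta = e.beta ?? 0;    // front/back tilt in degrees
  const gamma = e.gamma ?? 0;  // left/right tilt in degrees

  // Map a small tilt range to a few pixels of offset for a parallax feel.
  const x = Math.max(-15, Math.min(15, gamma)) * 0.8;
  const y = Math.max(-15, Math.min(15, beta - 45)) * 0.8; // ~45° is a natural holding angle
  img.style.transform = `translate(${x}px, ${y}px)`;
}

async function start(): Promise<void> {
  // iOS 13+ requires an explicit permission prompt for motion sensors.
  const doe = DeviceOrientationEvent as unknown as { requestPermission?: () => Promise<string> };
  if (typeof doe.requestPermission === 'function') {
    if (await doe.requestPermission() !== 'granted') return;
  }
  window.addEventListener('deviceorientation', onOrientation);
}

// Wire start() to a button click so the iOS permission prompt is allowed to appear.
```

On iOS the permission request has to come from a tap, which is why start() is meant to be hooked to a button rather than run on page load.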
Lol, with this you always have to look straight at the phone and rotate it. That’s a bad user experience compared to the Face ID solution, where you can move your head and the phone freely.