
But since you do not have access to colour originals of historical photos in almost every instance, you cannot possibly train the network to have any instinct for the colour sensitivity of the medium, can you?

An extreme example:

https://www.cabinetmagazine.org/issues/51/archibald.php

https://www.messynessychic.com/2016/05/05/max-factors-clown-...

Colourising old TV footage can only result in a misrepresentation, because the underlying colours were deliberately falsified so they would reproduce usably on the medium itself.
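The underlying problem is that a black-and-white image is a lossy projection: each film stock weights the colour spectrum differently, so the grey value alone cannot be inverted back to a unique colour. A toy sketch (the weight vectors below are illustrative placeholders, not measured film sensitivity curves) shows how the same red object lands on very different grey values depending on the medium's sensitivity:

```python
import numpy as np

# Illustrative spectral weights (assumptions, not real film curves):
# a panchromatic-style response sees all of R, G, B; an orthochromatic-style
# response is nearly blind to red, so red lips or ruddy skin render dark.
PANCHROMATIC = np.array([0.30, 0.59, 0.11])
ORTHOCHROMATIC = np.array([0.02, 0.55, 0.43])

def to_gray(rgb, weights):
    """Project an RGB colour (components in 0..1) onto one grey value."""
    return float(np.dot(rgb, weights))

red_lipstick = np.array([0.8, 0.1, 0.1])

# The same colour produces a mid grey on one medium and a near-black on
# the other; a colouriser seeing only the grey value cannot know which
# mapping produced it, let alone undo it.
g_pan = to_gray(red_lipstick, PANCHROMATIC)
g_ortho = to_gray(red_lipstick, ORTHOCHROMATIC)
```

Unless the model is told which sensitivity curve produced the photo, any inversion it picks is a guess.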

And this caricatured example underpins the problem with colourisation: contemporary bias is unavoidable, and can be misleading. Can you take a black and white photo of an African-American woman in the 1930s and accurately colour her skin?

You cannot.



> Can you take a black and white photo of an African-American woman in the 1930s and accurately colour her skin?

AI colorization will, in general, be plausible, not accurate.


Yeah, the model is racist for sure. That's a limitation of the dataset though (CelebA is not known for its diversity, but it was easy for me to work with; I trained this model on Colab).

And plausibility is a feature, not a bug.

There are always many plausibly correct colorizations of an image, which you want the model to be able to capture in order to be versatile.

Many colorization models introduce additional losses (such as discriminator losses) that avoid constraining the model to a single "correct answer" when the solution space is actually considerably larger.
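The point about additional losses can be made concrete with a toy numeric example (the RGB values below are illustrative, not from any real dataset): a pure regression loss against ground truth is minimised by the *average* of all plausible colorizations, which is itself implausible. That is the failure mode a discriminator loss is meant to relax, since it only asks that the output look like *some* real photo rather than match one specific target.

```python
import numpy as np

# Two equally plausible "ground truth" colourings of the same grey shirt
# (hypothetical values for illustration).
red_shirt = np.array([0.9, 0.1, 0.1])
blue_shirt = np.array([0.1, 0.1, 0.9])

def l2(pred, target):
    """Squared-error regression loss between two RGB predictions."""
    return float(np.sum((pred - target) ** 2))

# A model trained with pure L2 against both targets is minimised by their
# mean: a washed-out purple that nobody ever photographed.
l2_optimum = (red_shirt + blue_shirt) / 2  # [0.5, 0.1, 0.5]

# Expected L2 loss of the implausible average vs. committing to one mode:
loss_mean = (l2(l2_optimum, red_shirt) + l2(l2_optimum, blue_shirt)) / 2
loss_red = (l2(red_shirt, red_shirt) + l2(red_shirt, blue_shirt)) / 2
# loss_mean < loss_red: regression alone rewards the desaturated average,
# which is why an adversarial term is added to push outputs toward a
# single plausible mode instead.
```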


In other words, bullshit.


No more so than any other colorization method that isn’t dependent on out-of-band info about the particular image (and even that is just more constrained informed guesswork.)

That's what happens when you are filling in missing info that isn't in your source.

EDIT: Of course, color photography can be “bullshit” rather than accurate in relation to the actual colors of things in the image; as is the case with the red, blue, and green (actual colors of the physical items) uniforms in Star Trek: The Original Series. But there are also, fairly frequently, reproductions of skin tones that are inaccurate without being intentionally distortive (most politically sensitive in the US with racially non-White subjects, where there are also plenty of examples of deliberate manipulation).


Showing color X on TVs by actually making the thing color Y in the studio (well, on set during filming) is not bullshit. It's an intentional choice playing out as intended. It is meant to communicate a particular thing and does so.


That particular thing was not intentional, and it's the reason why the command wrap uniform (same color in person, different material), which was supposed to be color-matched to the made-as-green uniforms, doesn't match on screen.

But, yes, in general inaccurate color reproduction can be intentionally manipulated with planning to intentionally create appearances in photos that do not exist in reality.


*shrug* People like looking at colourised photos because it helps root the image within the setting of the real world they occupy.

For some it’s more evocative, regardless of the absolute accuracy.

Having a professional do it for that picture of your great grandad is expensive.

Having a colourisation subreddit do it is probably worse for accuracy.

I think there is a place for this bullshit.


The original color information just isn't there.

So bullshit is the best you're going to get.


Well, you could also not put more bullshit in the world by not doing the thing.


Why are you so negative about it? Pretty sure many people would find it impressive to colorize old photos to look at them as if these were taken in color.

Should artists not put their bs in the world? Writers? Musicians? Most of it is made up but plausible to make you feel something subjective.


People have been colorizing photos as long as there have been photos.


This is true, but if you have some reference images, you can probably adapt some of the recent diffusion adaptation work, such as DreamBooth, to tell the model "hey, this period looked like this", and fine-tune it.

https://dreambooth.github.io/



