Engineer and oceanographer Derya Akkaynak created an algorithm that removes the water from underwater images, stripping away the haze and color tint that distort most underwater photos. It doesn't require a color chart either, though it does need distance information, which it gathers from numerous photographs taken at different angles.
From the abstract:
The Sea-thru method estimates backscatter using the dark pixels and their known range information. Then, it uses an estimate of the spatially varying illuminant to obtain the range-dependent attenuation coefficient. Using more than 1,100 images from two optically different water bodies, which we make available, we show that our method with the revised model outperforms those using the atmospheric model. Consistent removal of water will open up large underwater datasets to powerful computer vision and machine learning algorithms, creating exciting opportunities for the future of underwater exploration and conservation.
Essentially, the algorithm estimates, for every pixel, how much the water has degraded each color channel as a function of distance, then reverses that degradation to restore the scene's true colors.
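To make that per-pixel correction concrete, here is a minimal sketch of inverting a revised underwater image-formation model of the form I = J * exp(-beta_d * z) + B, where B is backscatter and beta_d the range-dependent attenuation coefficient. This is not Sea-thru's actual implementation; the function and parameter names are illustrative, and Sea-thru estimates B and beta_d from the image itself rather than taking them as inputs.

```python
import numpy as np

def recover_color(image, depth, backscatter, beta_d):
    """Invert I = J * exp(-beta_d * z) + B to recover J.

    image:       (H, W, 3) float array in [0, 1]
    depth:       (H, W) per-pixel range z in meters
    backscatter: per-channel backscatter estimate B (shape (3,))
    beta_d:      per-channel attenuation coefficient (shape (3,))

    All names here are illustrative assumptions, not Sea-thru's API.
    """
    direct = image - backscatter                      # remove backscatter
    recovered = direct * np.exp(beta_d * depth[..., None])  # undo attenuation
    return np.clip(recovered, 0.0, 1.0)

# Toy usage with synthetic data (a real pipeline would estimate
# backscatter and beta_d from dark pixels and known ranges):
img = np.full((4, 4, 3), 0.5)
z = np.ones((4, 4))
J = recover_color(img, z, np.full(3, 0.2), np.full(3, 0.1))
```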
As with most algorithms of this type, the more data we feed it, the better it gets. Sea-thru already has great applications for furthering ocean-based research, but the natural follow-up question is whether the approach can be extended to other scattering media beyond water, such as smog or fog.
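That extension is plausible because atmospheric haze is traditionally described by a closely related model, I = J * t + A * (1 - t), where A is the airlight and t the transmission; the Sea-thru paper's point is that this model is too simple for water. A minimal sketch of inverting it, with illustrative names and assuming transmission and airlight come from some separate estimator (e.g., a dark-channel-style prior, not shown):

```python
import numpy as np

def dehaze(image, transmission, airlight, t_min=0.1):
    """Invert the atmospheric scattering model I = J * t + A * (1 - t).

    image:        (H, W, 3) float array in [0, 1]
    transmission: (H, W) per-pixel transmission t in (0, 1]
    airlight:     per-channel airlight A (shape (3,))
    t_min clamps transmission to avoid amplifying noise at low t.
    Names are illustrative, not a reference implementation.
    """
    t = np.clip(transmission, t_min, 1.0)[..., None]
    recovered = (image - airlight) / t + airlight
    return np.clip(recovered, 0.0, 1.0)

# Toy round trip: synthesize a hazy image from a known J, then invert.
J_true, A, t = 0.4, np.full(3, 0.9), np.full((4, 4), 0.5)
hazy = J_true * t[..., None] + A * (1.0 - t[..., None])
J_est = dehaze(hazy, t, A)
```

The key contrast with the underwater case is that here a single attenuation governs both terms, whereas Sea-thru's revised model uses distinct, range-dependent coefficients for direct signal and backscatter.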
What other uses can you imagine? It's not hard to picture this being applied to market data either ... Interesting stuff!