Reproduce paintings that make an impression



Empty frames hanging inside the Isabella Stewart Gardner Museum are a tangible reminder of the world's largest unsolved art theft. While the original masterpieces may never be recovered, a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) might be able to help, with a new system for designing reproductions of paintings.

RePaint uses a combination of 3D printing and deep learning to authentically recreate favorite paintings, regardless of lighting conditions or placement. RePaint could be used to remake artwork for the home, protect originals from wear and tear in museums, or even help companies create prints and postcards of historical pieces.

"If you just reproduce the color of a painting as it looks in the gallery, it may appear different at home," says Changil Kim, one of the authors of a new paper on the system, which will be presented at ACM SIGGRAPH Asia in December. "Our system works under any lighting condition, which offers far greater color reproduction capability than almost any previous work."

To test RePaint, the team reproduced a number of oil paintings created by a collaborating artist. The team found that RePaint was more than four times more accurate than state-of-the-art physical models at recreating the exact color shades of different works.
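An accuracy comparison like this implies some per-pixel color-error metric. As a hypothetical illustration only (the paper's own evaluation uses perceptual color metrics; the function name and sample values below are invented for this sketch), a mean color error between an original and its reproduction could be computed as a plain RGB distance:

```python
# Hypothetical sketch: score a reproduction by the mean Euclidean distance
# between corresponding RGB pixels of the original and the reproduction.
# (Perceptual metrics such as CIE delta-E would be more faithful; plain RGB
# distance is used here for brevity.)
def mean_color_error(original, reproduction):
    """Average per-pixel color distance; both inputs are lists of (r, g, b)."""
    assert len(original) == len(reproduction)
    total = 0.0
    for (r1, g1, b1), (r2, g2, b2) in zip(original, reproduction):
        total += ((r1 - r2) ** 2 + (g1 - g2) ** 2 + (b1 - b2) ** 2) ** 0.5
    return total / len(original)

# Identical images score 0.0; larger values mean a less faithful reproduction.
error = mean_color_error([(0.5, 0.2, 0.1)], [(0.5, 0.2, 0.1)])
```

A lower score under a metric like this, averaged across several lighting conditions, is one way to quantify the "four times more accurate" style of comparison described above.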

For now, the reproductions are only about the size of a business card, owing to the costly, time-intensive nature of the printing. In the future, the team expects that more advanced commercial 3D printers could help produce larger paintings more efficiently.

While 2D printers are most commonly used to reproduce paintings, they have a fixed set of just four inks (cyan, magenta, yellow, and black). The researchers, however, found a better way to capture a fuller spectrum of Degas and Dalí. They used a special technique called "color contoning," which involves using a 3D printer and 10 different transparent inks stacked in very thin layers, much like the wafers and chocolate in a Kit Kat bar. They combined this method with a decades-old technique called halftoning, in which an image is created from many small colored dots rather than continuous tones. According to the team, combining these elements better captures the nuances of color.
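Halftoning itself is simple to sketch. The snippet below is a minimal illustration, not the authors' code: an ordered-dither pass with a 4x4 Bayer threshold matrix turns continuous gray tones into a pattern of on/off dots whose average density approximates the original tone.

```python
# Illustrative halftoning sketch: ordered dithering with a 4x4 Bayer matrix.
# Each pixel is compared against a repeating threshold pattern, so a flat
# mid-gray region becomes a dot pattern of roughly 50% density.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def halftone(gray_image):
    """Map each pixel in [0.0, 1.0] to a dot (1) or blank (0)."""
    out = []
    for y, row in enumerate(gray_image):
        out_row = []
        for x, value in enumerate(row):
            # Normalize the matrix entry to a threshold in (0, 1).
            threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
            out_row.append(1 if value > threshold else 0)
        out.append(out_row)
    return out

# A uniform 50% gray tile dithers to dots covering half the pixels.
dots = halftone([[0.5] * 8 for _ in range(8)])
density = sum(map(sum, dots)) / 64  # -> 0.5
```

The same idea, applied per ink rather than per gray level, is what lets a printer with a small discrete ink set approximate continuous tones.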

With a wider color palette, the question remained of which inks to use for which paintings. Instead of relying on more labor-intensive physical approaches, the team trained a deep learning model to predict the optimal stack of different inks. Once the system was trained, they fed it images of paintings and used the model to determine which inks should be used in which areas of specific paintings.
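The shape of that prediction problem can be sketched without the deep network. In the toy version below (purely hypothetical: the ink names, colors, and table values are invented, and the paper's actual system learns this mapping with a neural model rather than a lookup), the "model" is replaced by a nearest-neighbor search over a small precomputed table of ink stacks and the colors they produce:

```python
# Hypothetical stand-in for the learned color-to-ink-stack mapping:
# pick, from a table of (ink stack, resulting RGB) pairs, the stack whose
# predicted color is closest to the target color. Values are illustrative only.
INK_TABLE = [
    (("cyan", "cyan", "clear"),    (0.10, 0.55, 0.85)),
    (("magenta", "yellow"),        (0.90, 0.30, 0.25)),
    (("yellow", "yellow", "cyan"), (0.35, 0.75, 0.30)),
    (("black", "clear"),           (0.12, 0.12, 0.12)),
]

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_stack(target_rgb):
    """Return the ink stack whose tabulated color is nearest the target."""
    return min(INK_TABLE, key=lambda entry: squared_distance(entry[1], target_rgb))[0]

# For a reddish target, the magenta-over-yellow stack is the closest match.
stack = best_stack((0.8, 0.25, 0.2))
```

A trained network replaces the fixed table with a function generalizing over the full, combinatorially large space of thin-layer ink stacks, which is what makes the physical trial-and-error approach unnecessary.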

Despite the progress made so far, the team says it has a few improvements to make before it can produce a dazzling duplicate of "Starry Night." For example, mechanical engineer Mike Foshey said they could not completely reproduce certain colors, such as cobalt blue, because of a limited ink library. In the future, they plan to expand that library and develop a painting-specific algorithm for selecting inks, he says. They also hope to achieve better detail by accounting for aspects such as surface texture and reflection, so they can reproduce specific effects like glossy and matte finishes.

"The value of works of art has grown rapidly in recent years, so there is an increasing tendency for them to be locked away in warehouses, out of public view," says Foshey. "We are building technology to reverse this trend and to create accurate, inexpensive reproductions that everyone can enjoy."

Kim and Foshey worked on the system alongside lead author Liang Shi; MIT professor Wojciech Matusik; former MIT postdoc Vahid Babaei, now a group leader at the Max Planck Institute for Informatics; Szymon Rusinkiewicz, professor of computer science at Princeton University; and former MIT postdoc Pitchaya Sitthi-Amorn, who is now a lecturer at Chulalongkorn University in Bangkok, Thailand.

This work is funded in part by the National Science Foundation.
