Texture synthesis is done with three different methods: random sampling, overlapping patches matched by SSD, and SSD with a minimum-error cut.
Comparisons are shown below:
[Comparison images: random sampling results, followed by the results of the other two methods]
Random sampling is of course terrible, and SSD with the min-error cut is generally the best. The cut lets the seam follow non-linear shapes between patches rather than a straight patch boundary. Better seams between patches help noticeably, such as for the radishes, since the seams tend to go between radishes rather than through them.
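Here is a minimal sketch of how the minimum-error seam can be found with dynamic programming, assuming the per-pixel squared difference over a vertical overlap region has already been computed (the function and array names are illustrative, not the exact ones from my code):

```python
import numpy as np

def min_error_cut(overlap_error):
    """Find a vertical seam through an (H, W) overlap-error surface.

    overlap_error: squared differences between the new patch and the
    existing output over the overlap region. Returns one column index
    per row; pixels left of the seam keep the existing output, pixels
    to the right come from the new patch.
    """
    H, W = overlap_error.shape
    cost = overlap_error.astype(float)
    # Accumulate the cheapest path cost from the top row downward,
    # allowing the seam to move at most one column per row.
    for i in range(1, H):
        for j in range(W):
            lo, hi = max(j - 1, 0), min(j + 2, W)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # Backtrack from the cheapest entry in the bottom row.
    seam = np.zeros(H, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for i in range(H - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(j - 1, 0), min(j + 2, W)
        seam[i] = lo + int(np.argmin(cost[i, lo:hi]))
    return seam
```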
Images that were already textures turned out well, as expected. The SSD matching generally re-pieced together the original texture, so the min-error cut did not have to do much.
I tried using larger overlaps with SSD and the min-error cut, since I thought that would give the cut more freedom in choosing a seam. However, the results actually seemed worse: the larger overlaps made the average SSD larger and more variable, so patch picking via SSD became somewhat random, and the chosen patches ended up too different from their neighbors for a good seam cut to make a visual improvement.
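For reference, the SSD-based patch picking can be sketched roughly as follows (a simplified, brute-force version; the names and the tolerance parameter are illustrative, not necessarily what my implementation uses):

```python
import numpy as np

def pick_patch_ssd(texture, out_region, mask, patch_size, tolerance=0.1):
    """Pick a patch from `texture` whose overlap pixels best match the
    already-synthesized output.

    texture, out_region: float arrays; out_region is the (patch_size,
    patch_size) window of the output being filled. mask is 1 where the
    output is already synthesized (the overlap), 0 elsewhere.
    """
    H, W = texture.shape[:2]
    candidates, errors = [], []
    # Brute-force scan of every possible source patch (slow but clear).
    for y in range(H - patch_size + 1):
        for x in range(W - patch_size + 1):
            patch = texture[y:y + patch_size, x:x + patch_size]
            diff = (patch - out_region) * mask[..., None]
            candidates.append((y, x))
            errors.append(np.sum(diff ** 2))
    errors = np.asarray(errors)
    # Choose randomly among patches within (1 + tolerance) of the best SSD,
    # so the synthesis does not tile the single best match everywhere.
    good = np.flatnonzero(errors <= errors.min() * (1 + tolerance))
    y, x = candidates[np.random.choice(good)]
    return texture[y:y + patch_size, x:x + patch_size]
```

With a larger overlap, the masked region grows, so the SSD values grow and spread out, which is consistent with the more random-looking patch picks described above.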
Even the brick image, which was already a texture, synthesized into an inaccurate-looking brick wall.
SSD: [result images]
Texture transfer is done by iteratively placing patches of a texture over a destination image, adjusting an "alpha" parameter so that successive passes match the destination image more and more closely.
The algorithm starts with large patches and a small alpha, so the first pass focuses more on matching patches to the destination image than on matching them with their neighbors. In each subsequent iteration the patch size is reduced and alpha grows, shifting the weight toward agreement with neighboring patches and the previously synthesized result. This lets the early passes establish the overall correspondence with the destination image while the later passes refine the patch seams and texture appearance. The equation used is the following, where alpha starts out small and grows larger:
minError = alpha * (neighborPatchError + previousPassError) + (1 - alpha) * destinationImageError
To place each patch, the algorithm finds the candidate with the minimum combined SSD against the neighboring patches, the previously synthesized image, and the destination image, then computes the min-error cut based only on the error with the neighboring patches.
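A sketch of that combined error for a single candidate patch, assuming float image arrays and a mask marking which overlap pixels are already filled (all names here are hypothetical):

```python
import numpy as np

def transfer_error(patch, synthesized, previous_pass, destination,
                   overlap_mask, alpha):
    """Combined error for one candidate patch in texture transfer.

    synthesized   - pixels already placed in the current pass (overlap)
    previous_pass - the result of the previous, coarser iteration
    destination   - the target-image region the patch should resemble
    overlap_mask  - 1 where `synthesized` is valid, 0 elsewhere
    alpha         - starts small and grows each iteration
                    (e.g. ramping from roughly 0.1 toward 0.9)
    """
    neighbor_err = np.sum(((patch - synthesized) * overlap_mask[..., None]) ** 2)
    previous_err = np.sum((patch - previous_pass) ** 2)
    dest_err = np.sum((patch - destination) ** 2)
    return alpha * (neighbor_err + previous_err) + (1 - alpha) * dest_err
```

The candidate with the smallest value of this error is placed, and the min-error cut from the synthesis step is then applied along its overlaps.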
Texture | Destination | Result |
[texture image] | [destination image] | [result image] |
I tried using more textures that aren't as uniform. I even tried using an ordinary image as the "texture", but the result was horrid: the lack of uniformity between the patches made everything look like a mess.
It was interesting how the center of the apple got trapped into using patches that ended with a dark strip along the bottom. As soon as one patch was used that matched the top and left overlaps but had a dark lower part, the following patches followed suit. It may have been better to match on intensity rather than on each color channel separately.
For destination images with too much detail, the texture could not capture the fine detail, but a scaled-down version of the result still closely resembles the destination image. The texture at least follows the low frequencies, while the high frequencies were mainly used to match the patches to one another.