Supplementary Materials
Method for facial feature replacement

The stimuli used in Experiments 2 and 4 included faces that were modified by changing facial features. These feature changes were mostly made by copying and pasting facial features from "donor faces" onto the "target face" (the face we want to change). For some features, copying and pasting was not possible; for example, to change eye distance we did not want to copy the eye region of the donor face (the horizontal strip at eye level), because that would change the eyes themselves as well, including features like eye shape and eye color, and not only the distance between them. The same is true for other dimension-related features, such as jaw width or face proportion. In these cases, the original face was edited so that the changed feature would resemble the donor feature as closely as possible.

All these facial-feature manipulations were performed in Adobe Photoshop by a professional graphic designer. To guide the graphic designer in changing features we used "worksheets", as shown in Figure S5. These worksheets were prepared based on the results of the feature tagging obtained in Experiment 1A, and are arranged as follows: on the left-hand side we present the target face, and below it a (partial) list of its feature values (5 features per worksheet, for a total of 4 worksheets per face). Consider, for example, the feature MouthSize, which has a small-large scale (see Table 1). The target face has a value of 0.649 on this feature; this positive value means that this face's mouth is larger than the average mouth in the database. So, to achieve a maximal feature change (as was our goal in Experiment 2 and in the "far" conditions of Experiment 4), we want to replace it with a small mouth. On the right-hand side of the worksheet we present faces from the database, arranged in columns, one column per feature; below each face we display its serial ID in the database and its value on that particular feature. Each column shows three faces, sorted by the (absolute) magnitude of their feature value (in Figure S5 we show only one face per feature, because the terms of use of the color FERET database limit the number of faces we can show in one paper). For example, the "MouthSize" column (second from the right) contains faces with small mouths, the top one (whose ID in the database is 57) having a value of -3.13. This face has the smallest mouth in the database, and is a good candidate for donation to the target face. The other two faces in the MouthSize column have the next-smallest mouths in the database. The graphic designer chose the feature from the donor face that best fit the target face.
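The donor-candidate ranking on the worksheets can be sketched as follows. This is an illustrative sketch only: the function name and all face IDs and values except face 57's MouthSize value of -3.13 are hypothetical, and we assume feature scales are centered at zero (positive = large, negative = small).

```python
# Hypothetical sketch of donor-candidate selection for one feature.
# Assumes feature values are centered at zero, as in the small-large scales
# described in the text; IDs and values other than face 57 are made up.

def donor_candidates(feature_values, target_value, n=3):
    """Return the n face IDs with the largest-magnitude feature values
    on the opposite side of the scale from the target face."""
    # Keep faces on the opposite side of zero from the target,
    # e.g. small mouths when the target mouth is large.
    opposite = {fid: v for fid, v in feature_values.items()
                if v * target_value < 0}
    # Rank by absolute magnitude, most extreme first.
    ranked = sorted(opposite, key=lambda fid: abs(opposite[fid]), reverse=True)
    return ranked[:n]

# Illustrative MouthSize values (negative = small mouth).
mouth_size = {57: -3.13, 12: -2.80, 91: -2.41, 8: 0.55, 33: 1.92}
print(donor_candidates(mouth_size, target_value=0.649))  # [57, 12, 91]
```

Face 57 ranks first because its mouth is the smallest in this toy database, matching the worksheet example in which it is the top donor candidate for a target face with a large mouth.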
The fitting criterion was that the resulting face, after the feature replacement, should look as natural as possible (we explain below how we measured this), and should require as little image-editing effort as possible to blend the pasted feature into the target face. In some cases this "copy-paste" procedure was straightforward, as in changing eye color, while in other cases it required more work. For features that could not be changed by copy-paste, such as eye distance or other dimension changes, we asked the graphic designer to edit the original face so that it would resemble the donor face as closely as possible. Again, the donor face was available to the graphic designer on the same worksheet, and the graphic designer was instructed to make the feature in the target face as similar as possible to that of the donor face (we explain below how we tested this). An example of the feature replacement process is shown in Figure 5 (main text).

We used two measures to evaluate the resulting changed faces:

First, we measured how natural the resulting changed faces looked. To do this we prepared a few versions of each changed face, and then compiled a list of face images, half of which were original (untouched) faces from our database and half changed faces. We presented subjects (20 students from Tel Aviv University) with these images in random order, and asked them to judge, on a scale of 1 to 5, whether each face was "Photoshopped". To explain to subjects what we meant by a "Photoshopped" face, we showed them Photoshopped faces of famous Israeli celebrities (for example, former Israeli president Shimon Peres with Tom Cruise's hair). We used this procedure to select, for each changed face, the version judged least Photoshopped of all its versions. Notably, some of the original faces were judged as Photoshopped, while some changed faces were judged as natural, indicating that (a) for unfamiliar faces it is not easy to determine whether a face is natural, and (b) the resulting changed faces did not exceed the boundary of the space of natural faces. We conclude that copying features from donor faces, rather than changing features arbitrarily, helped maintain the natural appearance of the changed faces used in our study.

As a second measure, we used the correlation between the values obtained in the re-tagging of the features of the changed faces (see main text) and their values in the donor faces. To do this we re-tagged the changed faces by repeating the face-tagging procedure described in Experiment 1 (i.e., asking subjects to assign values to features in the changed faces; these subjects were different from those who tagged the original faces). This enabled us to compare the feature values when those features were on the original (donor) faces with their values when they were on the target faces. Similar tagging would mean that the graphic designer successfully changed the feature of the target face to resemble the feature of the donor face. The results of this test are shown in Figure S6, where we plot the correlation between the predicted face-space distances between the original and changed faces and the actual face-space distances. The predicted face-space distances were calculated by creating feature vectors for the changed faces, replacing the values of the changed features with the values of the donor features, and measuring the distance between the resulting vector and the feature vector of the original face. The actual face-space distances were calculated using the feature vector of the changed face after re-tagging. The correlation between the two measurements was high and significant (r = 0.74, p < 0.001), indicating that the changed features were perceived as similar to the donor features, and therefore that the feature-changing procedure succeeded in copying the features from donor to target faces.
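The predicted-versus-actual distance computation described above can be sketched as follows, assuming Euclidean distance in the feature space. The vectors, the helper names `predicted_distance` and `actual_distance`, and all numeric values other than the MouthSize examples are hypothetical, not the study's actual data.

```python
import numpy as np

# Hypothetical sketch of the predicted vs. actual face-space distance
# comparison; assumes Euclidean distance between feature vectors.

def predicted_distance(original, changed_idx, donor_values):
    """Distance the change should produce: replace the changed features'
    values in the original vector with the donor features' values."""
    predicted = original.copy()
    predicted[changed_idx] = donor_values
    return np.linalg.norm(predicted - original)

def actual_distance(original, retagged):
    """Distance actually observed after re-tagging the changed face."""
    return np.linalg.norm(retagged - original)

# Toy example: a 5-feature face with feature 2 (e.g. MouthSize) replaced.
original = np.array([0.3, -1.1, 0.649, 0.0, 2.0])
donor = np.array([-3.13])                          # donor MouthSize value
retagged = np.array([0.3, -1.0, -2.9, 0.1, 2.0])   # values after re-tagging

d_pred = predicted_distance(original, [2], donor)
d_act = actual_distance(original, retagged)
# Across many changed faces, one would correlate the lists of d_pred and
# d_act values, e.g. via np.corrcoef(pred_list, act_list)[0, 1];
# the paper reports r = 0.74 for this correlation.
```

A high correlation between the two distance lists indicates that the edited features were perceived, under re-tagging, much as the donor features were.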
