This work focuses on images generated with Facebook DensePose, an open-source pose estimation tool. Pose estimation applies computer vision, which trains machines to identify and classify objects, to detecting and tracking human figures in visual material. DensePose goes beyond basic object recognition: it detects human bodies in images and constructs a 3D surface-based model of each body. Its training set of images of 50,000 people is also public. A typical application of the tool transfers one body's texture onto another body's pose. The outputs in this project are instead reconstructions of a single body onto itself.
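For readers curious how such a surface map is obtained in practice, the sketch below shows one way to run DensePose over a single image, assuming the detectron2 DensePose project; the config path, weights file, input filename, and result field names are assumptions and vary between releases.

```python
# Minimal sketch: run DensePose on one image and pull out its surface map.
# Module paths, config and weights filenames, and result fields are assumed
# from the detectron2 DensePose project and may differ between versions.
import cv2
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor
from densepose import add_densepose_config
from densepose.vis.extractor import DensePoseResultExtractor

cfg = get_cfg()
add_densepose_config(cfg)
cfg.merge_from_file("configs/densepose_rcnn_R_50_FPN_s1x.yaml")  # assumed config
cfg.MODEL.WEIGHTS = "model_final.pkl"                            # assumed weights
predictor = DefaultPredictor(cfg)

image = cv2.imread("swimwear_listing.jpg")  # hypothetical e-commerce image
instances = predictor(image)["instances"]

# Per-person result: `labels` is the body-part index map (0 = background,
# 1-24 = surface charts) and `uv` holds per-pixel surface coordinates.
results, boxes = DensePoseResultExtractor()(instances)
labels = results[0].labels.cpu().numpy()  # (H, W) part indices inside the box
uv = results[0].uv.cpu().numpy()          # (2, H, W) U and V in [0, 1]
```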
Images were taken from e-commerce entries for swimwear and run through two tools: the first identifies the body and flattens it into a texture of segmented body parts, and the second reconstructs the model based on the body's position. In the resulting output, image, surface, and body are placed onto a plane where their characteristics clash yet become indiscernible from one another. The effect of the technology emerges in the cracks, mistakes, and lost data of these generated bodies.
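The flatten-and-reconstruct step can be approximated with the 24-chart texture atlas used in common DensePose texture-transfer examples. The plain-NumPy sketch below builds an atlas from an image and its part/UV maps, then samples that atlas back onto a pose; the 200-pixel chart size, the 6-by-4 atlas layout, and the array shapes are assumptions, not the project's exact code.

```python
import numpy as np

CHART = 200  # assumed per-part texture resolution (6 x 4 grid of 24 charts)

def flatten_to_atlas(image, labels, u, v):
    """Project pixels of a detected body into a 24-part texture atlas.

    image:  (H, W, 3) uint8 crop of the person
    labels: (H, W) int part indices, 0 = background, 1-24 = body parts
    u, v:   (H, W) float surface coordinates in [0, 1]
    """
    atlas = np.zeros((6 * CHART, 4 * CHART, 3), dtype=np.uint8)
    for part in range(1, 25):
        mask = labels == part
        if not mask.any():
            continue  # part not visible: this gap stays black in the atlas
        row, col = (part - 1) // 4, (part - 1) % 4
        tu = np.clip((u[mask] * (CHART - 1)).astype(int), 0, CHART - 1)
        tv = np.clip((v[mask] * (CHART - 1)).astype(int), 0, CHART - 1)
        atlas[row * CHART + tv, col * CHART + tu] = image[mask]
    return atlas

def reconstruct_from_atlas(atlas, labels, u, v):
    """Re-render a body by sampling the atlas back onto a part/UV pose map."""
    out = np.zeros(labels.shape + (3,), dtype=np.uint8)
    for part in range(1, 25):
        mask = labels == part
        if not mask.any():
            continue
        row, col = (part - 1) // 4, (part - 1) % 4
        tu = np.clip((u[mask] * (CHART - 1)).astype(int), 0, CHART - 1)
        tv = np.clip((v[mask] * (CHART - 1)).astype(int), 0, CHART - 1)
        out[mask] = atlas[row * CHART + tv, col * CHART + tu]
    return out
```

Feeding the same part and UV maps into both functions reconstructs a body onto its own pose; because each pixel is mapped independently, occluded or misdetected regions simply drop out, leaving the cracks and lost data described above.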
Treated as images, the bodies are projected onto prefabricated swimwear patterns, then printed and sewn by a print-on-demand company. The material of swimwear is mutable, synthetic, and body-conforming; the intertwining of swimwear and body reveals the divisions and overlaps between synthetic and natural form. A live video presentation of the swimsuits, in which a model moves through poses, shows her struggling, obeying, indulging, and refusing in front of us.