High Resolution 3D Models of Formal Dresses

This was a final project for my internship at Queenly, a marketplace dedicated to formalwear and pageant dresses. My goal was to create 3D renders of a collection of the dresses listed on the Queenly site and determine how to integrate them smoothly into the app. Because it’s difficult to fully capture all the various embellishments on many of these dresses in standard listing photos, displaying 3D renders alongside the listings gives buyers as much visual information as possible to make an informed decision.

Swift

Objective-C

Xcode


The first challenge was to determine the best way of scanning dresses. I researched two different methods for creating models: LiDAR and photogrammetry. LiDAR works by sending laser pulses at objects in a space and sensing the reflected pulses to measure the distances between points, producing a point cloud of direct measurements from which a 3D scene can be recreated. However, the models it generated were blurry and low resolution; much of the embroidery and detailing work on the dresses was lost in the LiDAR scans.

To the right is a visualization of the LiDAR point cloud, and below are the results of 3D scanning using LiDAR.

Using photogrammetry, on the other hand, allowed for higher resolution 3D scans. Photogrammetry is a method of using overlapping photos to reconstruct a 3D model of an object. Essentially, a series of high resolution photos taken from different heights and angles is processed to generate a 3D map with elevation, shape, texture, and color information. From this data, a 3D model can be reconstructed.
My first attempt at photogrammetry was with Apple’s sample Photogrammetry Command-Line App, which takes a series of photos as input and outputs a 3D model reconstructed from them. To obtain the best results, we shot the dress against a solid background in a well-lit room and rotated it so that consecutive shots overlapped by at least 70%. These steps are key to making sure the program can recognize landmarks shared between photos and ultimately reconstruct the model.
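For reference, here is a minimal sketch of the RealityKit API that Apple’s sample is built on, with placeholder paths and only basic progress handling (the actual sample adds argument parsing and configuration options):

```swift
import Foundation
import RealityKit

// Minimal sketch of driving RealityKit's PhotogrammetrySession (macOS 12+).
// The input and output paths below are placeholders.
func reconstructDress() async throws {
    let photosFolder = URL(fileURLWithPath: "/path/to/dress-photos", isDirectory: true)
    let outputModel = URL(fileURLWithPath: "/path/to/dress.usdz")

    let session = try PhotogrammetrySession(input: photosFolder)

    // Consume the session's message stream for progress and errors.
    let waiter = Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestError(_, let error):
                print("Reconstruction failed: \(error)")
            case .processingComplete:
                print("Model written to \(outputModel.path)")
            default:
                break
            }
        }
    }

    // Request a full-detail .usdz model reconstructed from the photos.
    try session.process(requests: [.modelFile(url: outputModel, detail: .full)])
    try await waiter.value
}
```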

However, as we can see below, the models generated often had holes or extraneous parts in their meshes.

My next attempt at photogrammetry involved using Polycam, a popular 3D scanning app that can also reconstruct models through photogrammetry. The models generated by Polycam were noticeably more polished than those of Apple’s Command-Line App. The results were consistently high resolution and detailed. Whether scanning opaque, transparent, sequined, or patterned fabrics, Polycam yielded accurate 3D models, and seemed better equipped to handle the photogrammetry process in general.
I ended up creating 3D scans for about 10 dresses, featured below.


The next step was integrating the renders into the app. To render a 3D model in iOS, I used Apple’s 3D graphics framework, SceneKit. I created a displayable 3D scene using the SCNScene class, which is essentially a hierarchy of nodes that contain different attributes to represent 3D visuals.
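As a rough sketch of that step, assuming a scan exported as a .usdz file (the filename here is a placeholder for one of the dress models):

```swift
import SceneKit

// Load a scanned dress into a displayable SCNScene.
// "dress.usdz" is a placeholder name for one of the exported scans.
guard let modelURL = Bundle.main.url(forResource: "dress", withExtension: "usdz"),
      let scene = try? SCNScene(url: modelURL, options: nil) else {
    fatalError("Could not load the dress model")
}
```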

I then created an SCNView object and set its scene property to the scene I had just created, then made it visible by adding a light source. Since SCNScenes are composed of nodes, I did this by creating an SCNNode, setting its light property, and adding the new node as a child of the scene’s root node.
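Continuing the sketch, and assuming this code runs inside a UIViewController so that `view` is available:

```swift
// Display the scene in an SCNView and light it with a single omni light.
let sceneView = SCNView(frame: view.bounds)
sceneView.scene = scene
view.addSubview(sceneView)

// Add a light source as a child node of the scene's root node.
let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light?.type = .omni
lightNode.position = SCNVector3(x: 0, y: 10, z: 10) // placeholder position
scene.rootNode.addChildNode(lightNode)
```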

Lastly, in order to allow the user to control the camera of the scene, I configured the camera control properties of the SCNView object.
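In code, this amounts to something like the following; disabling translation is an optional tweak rather than part of the original setup:

```swift
// Let the user orbit and zoom the dress with standard gestures.
sceneView.allowsCameraControl = true

// Optionally restrict interaction, e.g. keep the dress centered
// by disabling camera panning.
sceneView.cameraControlConfiguration.allowsTranslation = false
```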

Feel free to check out this Medium post I made on high resolution 3D scans for more information!