As part of the Wisewands project, I created a web application called Patchwork. When we first started developing this project and envisioned producing complex, textured artwork (not simply the cartoon-style artwork that is prevalent among trait-based NFT collections), I recognized that the typical freeware solutions out there were not going to work. Most of them force you to isolate a trait on a single PNG file and assign the probability that it will render for any given NFT. In our case, each trait spanned multiple layers, and we needed logic to mix and match a trait's layers depending on the other traits present.
For example, we have a wizard that can be wearing a hood or a hat, can have different lengths of hair, and different combinations of facial hair, also in different lengths. The hood layer that matches long hair and a long moustache but not a long beard is different from the layer that matches short hair and a long beard. Different combinations required the ability to select layers intelligently, and in some cases there was a combination of foreground layers (e.g., the front part of the wizard's hair) and layers closer to the background (e.g., the part of the hair that falls across the shoulders and down his back). No publicly available software had the capability to manage this level of complexity.
I also wanted the ability to make some traits conditional on other traits. For example, the chance of a wizard having a tattoo on his chest is only triggered if the wizard isn't wearing an outfit covering his chest. Similarly, a wizard can only have a cape in the style of Dr. Strange if he is wearing a shirt that corresponds to the cape and not, for example, a robe. I also wanted to change the probability of certain traits depending on other traits. So, for example, the probability of a wizard having crazy eyes (e.g., all black or all white) is very low, unless the wizard has a death affinity. In that case, the probability of crazy eyes is higher. With this in place, I could increase the probability that certain traits matched compatible traits while still allowing them to appear sporadically elsewhere.
Patchwork is a web app, so the processing occurs on a server and not on your computer. It includes all the steps from setting up traits and probabilities to assembling the final artwork, then exporting the metadata in various formats. It can then upload and pin the image data and metadata to Pinata in preparation for the smart contract. Patchwork isn't publicly accessible yet, but if you need a tool like this, feel free to inquire. Here are some highlights of the current beta version, which we used successfully to prepare our project.
The interface is designed to be simple (in terms of graphics) and is largely drag-and-drop in a hierarchical tree-view format. You create a project and then create a number of trait groups. Each trait group is processed by the app from top to bottom as the collection of traits for a particular NFT is generated. So if you want some traits to be dependent on other traits, you need to make sure the principal traits are at the top or beginning of a thread (a "thread" is a collection of traits that are dependent on each other, left to right). If the color of the NFT's hat needs to match the color of its clothing, you would make sure the clothing color is determined before the hat.
In the example, you can see that Outfit is a trait group. Under it are three traits: None (i.e., bare-chested), Robe, and Leather Armor. "None" has odds of 2 assigned, Robe has 10, and Leather Armor has 10. This means there is a 2 out of 22 chance (9%) that the NFT will not be wearing clothing. Let's assume the app randomly determines the NFT is wearing a Robe. Following the thread, you can see that the next trait group below this is whether or not there is a tattoo on the NFT's arm. Since the NFT is wearing a robe, the only options available are None or Lower arm. (If the NFT were bare-chested, a different trait group would be triggered that included the possibility of tattoos on the Upper arm.) There is a 75% chance that the NFT will not have an arm tattoo. Let's assume that's the selection. The next trait group is the Head Covering. Note that this trait group is specific to the robe; this is because robes can include a hood, but other outfits cannot. You can create any number of trait groups and attach them to other traits by simply dragging and dropping them. A trait group can be attached to more than one trait if it is not unique to a particular thread. This effectively creates a mirror image (not a copy) of the same trait group, though there are also ways to make copies of traits and colors to speed up your workflow.
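Under the hood, odds like these behave like weighted raffle tickets: a trait's chance is its odds divided by the group's total. A minimal sketch of that weighted draw (Patchwork's actual implementation isn't shown; the function name here is invented):

```python
import random

def pick_trait(options):
    """Draw one trait name from {name: odds}, weighted by its odds."""
    names = list(options)
    return random.choices(names, weights=[options[n] for n in names], k=1)[0]

outfit_odds = {"None": 2, "Robe": 10, "Leather Armor": 10}

# "None" wins 2 of every 22 draws on average.
share = outfit_odds["None"] / sum(outfit_odds.values())
print(round(share * 100))  # → 9
```

The same draw is repeated at every trait group the thread visits.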
Clicking a trait lets you designate certain characteristics. These include a tag (a shorthand way of associating a trait beneath this node in a way that is relevant to the graphic layers), a name (the value that is passed up to a tag above this node), a child trait group (the trait group beneath this trait in the thread), and the odds that this trait will be selected. In the example, the trait is a hat. It sits under a parent node that has the tag "headcovering." Also under this parent node are None and Hood. You can also set up custom odds that are conditioned on traits selected earlier in the thread. In this example, the odds will shift from 2 to 5 if the affinity trait (associated with the "affinity" tag) is Death, which was previously determined in the thread. The Prev and Next buttons allow you to cycle to other nodes at the same level, so you can assign odds to a collection of traits without having to return to the tree-view.
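Conditional odds like the Death-affinity example can be modeled as overrides applied on top of a trait group's base odds before the weighted draw. A sketch under assumed data structures (the base odds for None and Hood, the "Storm" affinity, and the function itself are all invented for illustration):

```python
def effective_odds(base_odds, overrides, chosen_so_far):
    """Apply conditional odds: overrides maps (tag, value) -> {trait: new_odds}."""
    odds = dict(base_odds)
    for (tag, value), changes in overrides.items():
        if chosen_so_far.get(tag) == value:  # condition met earlier in the thread
            odds.update(changes)
    return odds

base = {"None": 5, "Hood": 10, "Hat": 2}
overrides = {("affinity", "Death"): {"Hat": 5}}  # Death affinity boosts the hat

print(effective_odds(base, overrides, {"affinity": "Death"})["Hat"])  # → 5
print(effective_odds(base, overrides, {"affinity": "Storm"})["Hat"])  # → 2
```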
When a collection of NFTs is rendered, the first step is collecting a series of traits for each, based on the odds and logic designated on this screen. The tree-view is really a decision tree; the app starts at the top and continues until it reaches the end, selecting traits based on the odds you assigned. If you are generating 10,000 NFTs, there will be 10,000 passes through this tree: 10,000 threads that will result in trait combinations for 10,000 pieces of art.
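A pass through the decision tree can be sketched as a recursive walk: draw a weighted option at each node, record the node's tag and the chosen value, and descend into the chosen option's child group if one is attached. The structure and names below are hypothetical, not Patchwork's internals:

```python
import random

# Hypothetical tree: each node is a trait group with weighted options;
# an option may attach a child trait group that continues the thread.
tree = {
    "tag": "outfit",
    "options": {
        "None": {"odds": 2},
        "Robe": {"odds": 10, "child": {
            "tag": "headcovering",
            "options": {"None": {"odds": 5}, "Hood": {"odds": 10}},
        }},
        "Leather Armor": {"odds": 10},
    },
}

def walk(group, chosen):
    """One pass through the tree: pick a weighted option at each node."""
    names = list(group["options"])
    weights = [group["options"][n]["odds"] for n in names]
    pick = random.choices(names, weights=weights, k=1)[0]
    chosen[group["tag"]] = pick
    child = group["options"][pick].get("child")
    if child:  # the thread continues only where a child group is attached
        walk(child, chosen)
    return chosen

# 10,000 NFTs means 10,000 independent passes through the tree.
collection = [walk(tree, {}) for _ in range(10_000)]
```

Note that only Robe attaches the headcovering group, matching the idea that some trait groups are specific to one thread.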
Once you have the logic set up for traits, you need to associate the various tags with the layers of your art. The assumption is that you have a complex Photoshop document with any number of layers that correspond not only to traits but also to combinations of traits. So you may have multiple layers for a trait, including layers that change depending on other traits. In our project, for example, we have layers for a hat that differ depending on the NFT's hair length and whether or not the NFT has sunglasses on. Each of these traits must have a tag designated on the Traits page in order to match the value with a layer. So, for example, the NFT might have headcovering=Hat, eyewear=Sunglasses, hair=Long, etc.
In Photoshop, you will need to export each layer as a PNG with transparency. There is a built-in script to do this, and also a very good one that you can find on the web and import. It will run through all your layers and save each one individually, using the name you assigned the layer. These can be imported into Patchwork and will appear on the Layers page. The layers should be imported in the same order as in Photoshop, but from bottom to top (Photoshop normally places the lower layers at the bottom), so the background becomes the top layer in the list. (When the app constructs your art, it starts at the very back and stacks layers up to the one most in the foreground.) Patchwork allows layers to be dragged and reordered, and you can invert the order of all the layers.
When you click a layer, you will see a screen that allows you to designate the conditions under which this layer appears in the art for a given NFT (based on the designated traits). In the example to the right, the layer file is called robe_hood_down_blue.png. This is a Photoshop layer that shows a blue robe without a hood. Below this are conditions, which cause the layer to display, and exceptions, which prevent it from displaying. The conditions must all be true for the layer to display. Those conditions are outfit=Robe and outfitcolor=Blue (outfit and outfitcolor are tags we assigned to those traits back on the Traits page, and they are automatically offered as selections on this screen). The exception is headcovering=Hood. If any exception applies, this layer will not be included in the final art. Since there is a different layer that corresponds to a blue robe with a hood, we don't want this layer to be included if the NFT has a hood. Note that there are also ways to create either-or conditions (the layer matches any of several traits) and combined exceptions (the layer is excluded only when a combination of traits occurs together).
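The basic matching rule described here (every condition true, no exception true) is straightforward to sketch; the layer data structure below is hypothetical, and the either-or and both-and variants are left out for brevity:

```python
def layer_applies(layer, traits):
    """Include a layer only if every condition holds and no exception holds."""
    if not all(traits.get(tag) == value for tag, value in layer["conditions"]):
        return False
    if any(traits.get(tag) == value for tag, value in layer["exceptions"]):
        return False
    return True

robe_layer = {
    "file": "robe_hood_down_blue.png",
    "conditions": [("outfit", "Robe"), ("outfitcolor", "Blue")],
    "exceptions": [("headcovering", "Hood")],
}

hood_down = {"outfit": "Robe", "outfitcolor": "Blue", "headcovering": "None"}
hood_up = {"outfit": "Robe", "outfitcolor": "Blue", "headcovering": "Hood"}
print(layer_applies(robe_layer, hood_down))  # → True
print(layer_applies(robe_layer, hood_up))    # → False
```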
The Prev and Next buttons let you save and move directly to the next or previous layer, so you can add the same condition or exclusion in rapid succession to a series of related layers.
The Generate page allows you to test your work from the previous two pages. It runs through your decision tree, assembling a collection of traits (each trait being a tag and a value, such as outfit=Robe or headcovering=Hat) for one sample NFT based on the odds you assigned. It then selects the layers that match those traits and displays the resulting artwork on the page. Every time you click the Build button at the top, a new series of traits is generated and a matching image displayed. To the right of the image is the collection of traits, and below it is a list of the layers that matched those traits, based on the conditions and exclusions you designated on the Layers page. You can manually change the list of traits and then click the yellow Build button to check the results.
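Assembling the final image from the matched layers amounts to a back-to-front, paint-over stack. A pure-Python sketch of that idea (a real implementation would more likely use an imaging library; pixels here are RGBA tuples, and alpha is treated as all-or-nothing for simplicity):

```python
def composite(layers):
    """Stack layers back to front; each layer is a 2-D grid of RGBA tuples."""
    height, width = len(layers[0]), len(layers[0][0])
    out = [row[:] for row in layers[0]]      # start with the backmost layer
    for layer in layers[1:]:                 # paint each nearer layer on top
        for y in range(height):
            for x in range(width):
                pixel = layer[y][x]
                if pixel[3]:                 # any non-zero alpha paints over
                    out[y][x] = pixel
    return out

# A 2x2 red background with a single blue "robe" pixel painted over it.
red = [[(255, 0, 0, 255)] * 2 for _ in range(2)]
robe = [[(0, 0, 255, 255), (0, 0, 0, 0)],
        [(0, 0, 0, 0), (0, 0, 0, 0)]]
art = composite([red, robe])
print(art[0][0])  # → (0, 0, 255, 255)
print(art[0][1])  # → (255, 0, 0, 255)
```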
This page is the workhorse of the app, and you don't need to use it until you're confident that the traits are set up correctly and the layers are rendering properly. It performs a number of tasks, from generating traits for your entire collection to creating the artwork to reporting on the rarity of traits and NFTs. The reporting uses two different algorithms for calculating rarity so that you can see which NFTs in your collection are the most valuable to investors.
The top list allows you to create traits, optionally include images, and assign rarity stats to each. The app currently generates traits and artwork at a rate of roughly one NFT every three-quarters of a second. (At some point I will move this to a background task, and it will undoubtedly go more quickly.) The bottom list offers various reporting and data export options. You can view rarity stats for your collection, including the traits and artwork for each NFT, in a list. You can export this as a PDF for offline review. You can also export the metadata as JSON (individual files) or Excel (a table with all traits as columns and a link in each row to the corresponding art file).
The rarity reporting helps you assess how people will value your NFTs. It uses two standard algorithms and orders your collection by ranking, showing each trait associated with each NFT and the percentage of NFTs in the collection that share that trait. Note that you can assign friendlier names to the tags used on the Traits page prior to export, and if there are traits used in generating the artwork that are not relevant to the metadata, you can exclude them from the output.
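The two algorithms aren't named here, but one common rarity score sums, for each of an NFT's traits, the reciprocal of that trait's share of the collection, so rare traits contribute large terms. A sketch of that approach (the function and sample data are illustrative only):

```python
from collections import Counter

def rarity_scores(collection):
    """Score each NFT as the sum of 1 / (share of NFTs with each of its traits)."""
    n = len(collection)
    freq = Counter(pair for traits in collection for pair in traits.items())
    return [sum(n / freq[pair] for pair in traits.items()) for traits in collection]

collection = [
    {"outfit": "Robe", "eyes": "Normal"},
    {"outfit": "Robe", "eyes": "Normal"},
    {"outfit": "None", "eyes": "Crazy"},  # unique traits score highest
]
scores = rarity_scores(collection)
print(scores)  # → [3.0, 3.0, 6.0]
```

Sorting the collection by this score descending gives the ranking the report displays.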