What are the t-shirts made of?

We use the Anvil 980, which is made of a thick cotton with a high stitch density. With a tubular construction, it's less fitted than other t-shirt models. All technical talk aside, it means that this classic tee is very comfortable and durable. 100% ring-spun cotton. Fabric weight: 4.3 oz/yd² (145.79 g/m²). Pre-shrunk, shoulder-to-shoulder taping. Double stitched sleeves and bottom hem.

Are the doodles printed or embroidered?

Embroidered! It costs us a bit extra, but we think it gives each t-shirt a premium look and feel.

Do you ship worldwide?

We have a limited number of countries we ship to outside of the United States. If you're not sure if your country is one of them, feel free to reach out to us via our contact page. Down the road, we hope to add additional international shipping locations.

Do you offer free shipping?

We sure do! Please note that our shipping estimates do not include the time it takes our manufacturer to print and fulfill the order. Once an order is fulfilled, it typically takes 5-8 business days for you to receive your t-shirt(s).

Contact info?

Visit our contact page, fill out the form, and we'll get back to you ASAP.

The Science

Paul, our esteemed engineer, trained a variational autoencoder (a type of machine learning model) on Google Creative Lab's Quick, Draw! Dataset. He used the model architecture from the Sketch RNN paper (cited below) and trained the model on NYC-themed drawings.

How does it learn? The model encodes each drawing using a recurrent neural network, a kind of neural network that is especially good at understanding sequences; in this case, the lines that make up a drawing. From that encoding, the model learns to generate a distribution for each line of the drawing it will create. It trains itself by "sampling" from these distributions to generate new drawings and comparing them to the real drawings. There's more nuance than that, but this is the general idea.
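To make the "distribution per line" idea concrete, here is a toy sketch in plain Python. It is not the real Sketch RNN architecture: the `encode`, `decode_step`, and loss functions below are hypothetical stand-ins (the real model uses RNNs and a mixture of Gaussians), but the shape of the idea is the same: encode a drawing, predict a distribution for each next line, and score how likely the real line is under that prediction.

```python
import math
import random

# A drawing is a list of (dx, dy) pen offsets ("lines").
# Hypothetical "encoder": collapse the drawing into a tiny latent summary.
def encode(drawing):
    n = len(drawing)
    return (sum(dx for dx, _ in drawing) / n,
            sum(dy for _, dy in drawing) / n)

# Hypothetical "decoder" step: predict Gaussian parameters (mean, std)
# for the next line, conditioned on the latent code and the previous line.
def decode_step(latent, prev_line):
    mx, my = latent
    px, py = prev_line
    return ((0.5 * mx + 0.5 * px, 0.5 * my + 0.5 * py), 1.0)

# "Sampling" from the predicted distribution, as described above.
def sample_line(params):
    (mux, muy), std = params
    return (random.gauss(mux, std), random.gauss(muy, std))

# Training signal: how unlikely the REAL line is under the predicted
# distribution (negative log-likelihood of an isotropic Gaussian).
def negative_log_likelihood(params, real_line):
    (mux, muy), std = params
    rx, ry = real_line
    return ((rx - mux) ** 2 + (ry - muy) ** 2) / (2 * std ** 2) \
        + math.log(2 * math.pi * std ** 2)

drawing = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0)]
z = encode(drawing)
loss = sum(negative_log_likelihood(decode_step(z, drawing[i - 1]), drawing[i])
           for i in range(1, len(drawing)))
sampled = sample_line(decode_step(z, drawing[-1]))
```

During real training, the loss is minimized with gradient descent so the predicted distributions place more probability on the lines people actually drew.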

How does it draw? When the model generates a new drawing, it takes a sample from the distribution of all potential first lines. Once it has the first line, it takes a sample from the distribution for the second line, taking into account the first line it generated, and so on. In the end, the model behaves kind of like a human: it draws line after line until it reaches a sufficient number of lines and ends with a completed drawing!
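The drawing loop above can be sketched as a few lines of Python. Again, this is a toy illustration, not the production model: `next_line_distribution` is a hypothetical stand-in for the decoder RNN, and we stop after a fixed number of lines rather than predicting an end-of-drawing signal as the real model does.

```python
import random

random.seed(0)  # reproducible toy output

# Hypothetical stand-in for the decoder RNN: the predicted mean for the
# next line is conditioned on the previous line, with a fixed spread.
def next_line_distribution(prev_line):
    px, py = prev_line
    return (0.9 * px, 0.9 * py), 0.1

def draw(max_lines=5):
    lines = []
    prev = (1.0, 0.0)  # a sampled "first line" to seed the loop
    for _ in range(max_lines):
        (mux, muy), std = next_line_distribution(prev)
        line = (random.gauss(mux, std), random.gauss(muy, std))
        lines.append(line)   # each new line depends on the one before it
        prev = line
    return lines

drawing = draw()  # a list of (dx, dy) pen offsets, one per generated line
```

Each sampled line feeds back in as the conditioning input for the next one, which is what makes the generation autoregressive, line after line, just like a person sketching.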

The People

So who are "we"? Paul Blankley is an AI expert who is currently building an AI-powered data preparation tool with his co-founder, Ryan Janssen. His interests are eclectic and range from teaching Generative Adversarial Networks (GANs) how to create art, to writing about near-Earth asteroids (see his asteroid clustering paper), to rock climbing and mountaineering (for all those mountains in NYC).

Tyler Becker leads community at Betaworks Studios, a clubhouse for nerds. He introduces members to the right people and most relevant resources in a startup-concierge format. Tyler pats himself on the back for his travel planning skills, optimism, and ability to write in the third-person. You can likely find him on the weekend playing street hockey or getting Seamless delivered to a dive bar.

Sarah McBride is a marketing specialist and content creator deeply passionate about GIFs. And field hockey. And telling people that she's from Northern Ireland. 

The Credits

Betaworks: for introducing us all to this synthetic future and inspiring us to build a side project to explore it further... so get yourself a t-shirt, won't you?

Google Creative Lab's Quick, Draw! Dataset: for creating an open-source project from which we could source our training data.

A Neural Representation of Sketch Drawings: for publishing the original research on which our project is based.

Alexis David Jacq: for open sourcing his work and saving Paul implementation time.

Product Hunt: for building a platform for us to ship on and get inspiration from.

Ryan Janssen: Paul's patient co-founder and our 1st customer.

hipsterbusiness.name: for artificially generating our name + logo.

Text To Transformer: for giving us endless laughs and occasional copy ideas, all AI-generated using OpenAI's GPT-2 language model.