Janelle Shane studies artificial intelligence and posts the funniest results, as we've seen before. After several attempts at training algorithms to generate recipes, and even baking brownies full of horseradish (shown above), she still hasn't given up.
I’ve seen neural net recipes that call for crushed sherry or 21 pounds of cabbage. One of my personal favorites is a recipe called “Small Sandwiches” that called for dozens of fussily chopped, minced, and diced ingredients - before chucking them in the food processor for 3 hours. Part of the problem has been neural nets with memory so terrible that halfway through the recipe they forget they’re making cake.
More recent neural nets like GPT-2, given better long-term memory and extensive training on a huge portion of the internet, can make recipes that are more likely to pass for the real thing. Use talktotransformer.com to prompt GPT-2 with “Black Forest Cake. Ingredients:” and the quantities and ingredients will be reasonable, even if the whole thing doesn’t quite work (generating a few examples in a row, I saw some Black Forest recipes that called for kneading the batter, and one that suggested pouring the batter into a toaster).
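If you'd rather prompt GPT-2 yourself than use the website, here is a minimal sketch of the same idea, assuming the Hugging Face "transformers" package and the small public "gpt2" checkpoint (talktotransformer.com uses a larger GPT-2 model, so your recipes will likely be even stranger):

```python
# Minimal sketch: prompt a small GPT-2 model with the start of a recipe
# and let it complete the ingredient list, as talktotransformer.com does.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the (probably nonsensical) recipe reproducible

prompt = "Black Forest Cake. Ingredients:"
outputs = generator(prompt, max_new_tokens=100, num_return_sequences=1)
print(outputs[0]["generated_text"])
```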
In her latest post, Shane passes along AI recipes for Crock Pot Cold Water, Chocolate Chicken Chicken Cake (which contains chicken but no chocolate), and Completely Meat Circle. How could you possibly make these recipes worse? For her next experiment, Shane proposes training a neural network on vintage American recipes, those abominations that rely heavily on Jell-O and condensed soup. You can submit your suggestions for that experiment. -via Metafilter