We've been wondering about the mental health of artificial intelligence. OpenAI has released CLIP, a state-of-the-art neural network trained to connect images with text. Just as a human can look at an image and describe its contents, CLIP can perform a wide variety of image classification tasks. To get a sense of how it is coping, we performed an examination, showing CLIP a series of Rorschach tests (a psychological test in which subjects' perceptions of inkblots are recorded and used as a gauge of emotional functioning). Out of respect, we are simply presenting CLIP's responses without interpretation. Made using the AscendingCLIP notebook by @Advanoun. This is a collaboration between @AdamBroomberg and @IsaacSchaal.
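
For readers curious about the mechanics, here is a minimal sketch of how one can "show" CLIP an inkblot: the image and a handful of candidate descriptions are embedded, and CLIP scores how well each description matches what it sees. This is only an illustration using OpenAI's public `clip` package, not the AscendingCLIP notebook the project itself used; the file name `inkblot_01.png` and the candidate captions are hypothetical.

```python
# Illustrative sketch: scoring one inkblot against candidate descriptions with CLIP.
# Assumes: pip install torch git+https://github.com/openai/CLIP.git
# and a local image file inkblot_01.png (hypothetical).
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Hypothetical candidate "perceptions" of the inkblot.
captions = [
    "a butterfly",
    "two people facing each other",
    "a dark storm cloud",
    "nothing at all",
]

image = preprocess(Image.open("inkblot_01.png")).unsqueeze(0).to(device)
text = clip.tokenize(captions).to(device)

with torch.no_grad():
    # CLIP embeds the image and each caption, then scores their similarity.
    logits_per_image, _ = model(image, text)
    probs = logits_per_image.softmax(dim=-1).squeeze(0)

for caption, p in zip(captions, probs.tolist()):
    print(f"{p:.3f}  {caption}")
```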