📣 In this issue:
Hyper-realism in oils, symbolic sun and stars, and pretending to be ChatGPT.
📰 News-To-Know
1
Fortune delves into recent turmoil in the AI industry, focusing on two significant shake-ups, one at Inflection AI and one at Stability AI, and asking what they signal for the broader sector.
Inflection AI, co-founded by Mustafa Suleyman of Google DeepMind fame, saw a major shift as Suleyman and key members of his team departed for Microsoft, while the company refocused on corporate clients. Inflection had originally made waves with its chatbot assistant, designed to manage daily tasks and provide emotional support, backed by an impressive $1.5 billion in funding and a $4 billion valuation.
Stability AI, meanwhile, faced its own set of challenges. Emad Mostaque, the CEO who had put the company at the forefront of the field with its text-to-image generator Stable Diffusion, stepped down amid disputes with investors, a staff exodus, and unusual claims about his past as a secret agent for the U.K. government. Even before his departure, the company, valued at $1 billion in 2022, was under scrutiny over its future direction.
These events have sparked debate over whether they are isolated incidents or signs of a broader "reality check" for an AI industry that has been riding a wave of venture capital and high valuations with little revenue to show for it. The article suggests the sector may be at a turning point as investors begin to question heavy spending without corresponding revenue streams, and argues the industry needs a more focused approach: the market does not need an oversaturation of AI chatbots and image generators.
2
AI diffusion models for image generation are getting 30x faster. MIT researchers have introduced a framework that generates images in a single step rather than the traditional multi-step denoising process. The technique, known as distribution matching distillation, uses a teacher-student setup: a new one-step model (the student) learns to reproduce the behavior of an established multi-step model (the teacher). It not only speeds up generation dramatically but maintains, and in some cases improves, image quality (and it requires considerably less computational power!).
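The teacher-student idea above can be sketched with a toy example. This is a deliberately simplified, hypothetical illustration, not MIT's actual distribution matching distillation: a "teacher" refines a noisy value over 30 small steps, and a one-parameter "student" is fit to reproduce the teacher's final output in a single call.

```python
# Toy sketch of teacher-student distillation (illustrative only; the real
# DMD method trains a one-step image generator against a diffusion model).
import random

def teacher_denoise(x, steps=30, rate=0.2):
    # Multi-step refinement: each step pulls the noisy value a little
    # closer to the clean target (0.0 in this toy setup).
    for _ in range(steps):
        x = x - rate * x
    return x

# Build training pairs: noisy input -> teacher's 30-step output.
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(100)]
ys = [teacher_denoise(x) for x in xs]

# Fit a one-step student y = a * x by closed-form least squares.
a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def student_one_step(x):
    # One function call replaces the teacher's 30 iterations.
    return a * x

print(abs(student_one_step(0.5) - teacher_denoise(0.5)) < 1e-9)  # True
```

Because the student skips the iterative loop entirely, inference cost drops by roughly the number of teacher steps, which is the intuition behind the reported 30x speedup.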
3
Weird and nonsensical AI-generated images are "destroying Facebook." Images like "Shrimp Jesus" gain algorithmic traction because they rack up views and comments.