A peek into the “black box”
Everything you need to know in AI this week
Hey there! There’s so much to love about the current state of AI—the infinite innovation, the momentum, the pace…but the best part? The meme energy. Imaginations are going wild with DALL-E 3: Pope Vader or a Jesus selfie, anyone?
Anyway, here are the top AI headlines you probably missed this week. :)
—Matt
Anthropic’s Neural Breakthrough
Source: Anthropic
Neural networks have been central to AI’s innovation and success for years. Mimicking the interconnected neurons of the human brain, artificial neural networks power the AI models trained on vast amounts of data (like chatbots and image generators). And yet, because of the “black box” nature of neural networks, no one fully understands how each neuron contributes to a model’s final output. But that’s about to change.
Why? Anthropic announced a breakthrough in understanding the behavior of artificial neurons in neural networks. The TL;DR:
Most neurons are “polysemantic,” meaning a single neuron can fire in response to mixtures of seemingly unrelated inputs. That makes models hard to interpret and can produce odd failures, like a neural network conflating DNA sequences with the lyrics to “Lose Yourself.”
Instead of trying to understand individual neurons, Anthropic researchers realized that complex neural networks can be better understood by different units of analysis: the neural networks’ learned features.
These features provide the context required to activate certain combinations of neurons (e.g. one feature may activate neurons associated with DNA sequencing, while another may activate neurons associated with Eminem lyrics).
So, Anthropic built machinery (using a technique called dictionary learning) that finds these features in small transformer models. This provides a path to breaking complex neural networks down into parts we can understand.
Anthropic believes that this will be a game-changer in the understanding of neural networks’ inner workings.
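If the “features, not neurons” framing feels abstract, here’s a toy numpy sketch. To be clear, this is purely illustrative and not Anthropic’s actual method: the two “features” (a made-up “DNA” feature and a made-up “lyrics” feature) and the tiny four-neuron setup are my own assumptions. The point is just that when every neuron responds to both features (polysemanticity), individual neuron values tell you little, while projecting the activations back onto the feature directions recovers a sparse, readable description.

```python
import numpy as np

# Toy illustration only (not Anthropic's method): features as
# directions in neuron-activation space.
rng = np.random.default_rng(0)

n_neurons, n_features = 4, 2
# Each hypothetical feature ("DNA", "lyrics") is a direction shared
# across all four neurons -- every neuron is polysemantic.
feature_dirs = rng.normal(size=(n_features, n_neurons))

# A "DNA-like" input activates feature 0 strongly and feature 1 not at all.
feature_activity = np.array([1.0, 0.0])
neuron_activations = feature_activity @ feature_dirs

# Looking at any single neuron's value is uninformative, since both
# features drive it. Solving for the feature activities instead
# recovers the sparse, interpretable description.
recovered, *_ = np.linalg.lstsq(feature_dirs.T, neuron_activations, rcond=None)
print(np.round(recovered, 3))  # only the "DNA" feature comes out active
```

In the real research the feature directions aren’t known in advance; that’s what the dictionary-learning machinery discovers from raw activations.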
Zoom out: The implications for LLMs are huge. Anthropic’s groundbreaking research means we might soon be able to better predict and control neural networks’ outputs—a huge win for LLM safety and reliability as we move towards enterprise and societal adoption.
Today’s Future Tools brought to you by:
Augment
How the Bold Do the Impossible.
Be present during meetings & never drop the ball. AI meeting transcripts & summaries across any application (Zoom, Google Meet, MS Teams, FaceTime, WhatsApp...). All without a bot.
Read twice as fast. Have AI summarize newsletters, emails, documents, and PDFs.
Type less. Save time, not tone. A writing assistant who understands the audience: your team vs. your toddler.
Draw from your second brain & have AI write recap emails or reports without having to upload context.
🗞️ Other AI News & Articles
More AI-Powered Creativity by Adobe
After Canva and Microsoft released various AI-powered creative tools earlier this year, Adobe is stepping into the ring. The company announced three new foundational Adobe Firefly models for images, vectors, and design this week.
1) Firefly Image 2: Adobe’s Firefly Image Model has been around since March, but this upgrade seriously improves its image generation capabilities. The new Firefly Image 2 model was trained on 70% more image data, enabling higher quality visual generations and more realistic images. Designers are particularly excited about one new capability:
Generative Match. This feature lets you fine-tune Firefly’s output by supplying a reference image whose style it then mimics.
2) Firefly Vector: Adobe claims this text-to-vector model for Adobe Illustrator is the “world’s first generative AI model for vector graphics.” It allows users to create “human quality” vector images using text prompts. Each element of the graphic is then split into “logical” groups for easy editing.
3) Firefly Design: This new model generates customizable templates for print content, social media posts, videos, and online advertising. The templates are fully editable and are generated using standard text prompts.
All three models are currently available in beta—the final launch date is TBA.
Big Tech Is Bleeding Cash
With AI being the hottest topic in tech this year, surely the companies behind the popular generative AI products must be rolling in cash. The truth? Big Tech is still struggling to turn the hype into profits.
Every month, companies like Microsoft and Google are losing between $20 and $80 per user of their AI products, according to the Wall Street Journal. That includes subscription-based AI tools.
What’s so expensive? Generative AI models demand enormous computing power, and ongoing GPU shortages make that power even pricier. They also lack the economies of scale of standard software products: every new query triggers its own round of intense computation.
Looking ahead: AI models have a hefty price tag right now—but that doesn’t mean it’ll stay that way. As historically demonstrated by many sophisticated technologies (like cloud storage and 3D animation), technological innovation and growing economies of scale will likely drive down AI’s processing costs in the future.
Everything else:
This startup is tackling the practice of culling male chicks by revolutionizing MRI scanning with the help of AI.
Analysts predict that generative AI is in for a “cold shower” in 2024.
AI goes medical as Google announces new generative AI capabilities for doctors.
The G7 countries are planning to establish international AI rules and standards by Christmas.
A growing chorus is denouncing AI’s massive energy use.
📺️ This Week’s Videos to Watch:
More important AI news: Dive deeper into this week’s hottest AI news stories (because yes, there are even more) in my latest YouTube video:
Ready to dig into those new Adobe updates? I covered them in-depth (and even demoed them!) in a recent video:
Food for thought: Geoffrey Hinton, aka the “Godfather of AI,” discusses the promises (and risks) of AI.
And that’s a wrap! I’ll be back in your inbox next Wednesday with a fresh roundup of the hottest new AI tools. See ya then!
—Matt (FutureTools.io)
Interested in working together on a newsletter sponsorship, YouTube integration, or featured listing on the Future Tools website? I look out for the audience above all else, so I'm pretty picky—but get in touch here and let's see if we can make something happen.
If you haven’t already, be sure to follow me over on X. I like to share news and AI updates as they happen over there.
And don't forget to check out all of the newest tools we've just added on Future Tools!
You rock! See ya next week. :)
P.S. This newsletter is 100% written by a human. Okay, maybe 96%.