Software², Superabundant Intelligence, and MiniGPT – Live and Learn #16
Welcome to this edition of Live and Learn. This time with an open-source alternative to ChatGPT, an article about a paradigm that might lead closer to the realization of AGI, and a list of crazy things that people are building on top of tools like AutoGPT and BabyAGI.
✨ Quote ✨
The existence of consciousness is both one of the most familiar and one of the most astounding things about the world.
— Thomas Nagel (source: Mind and Cosmos)
🖇️ Links 🖇️
MiniGPT by Vision CAIR. An open-source implementation of a model that behaves similarly to ChatGPT. This one is multimodal, which means it can also handle images and describe their content very accurately. In this regard it is like OpenAI's GPT-4, which promised these capabilities as well but hasn't opened them to the public yet. What I find fascinating about something like MiniGPT is that people are already building things that will soon rival GPT-4 yet will be available to everybody for free. Because of that, there will be an explosion in capabilities and use cases, just like when StabilityAI open-sourced their StableDiffusion model. I am looking forward to that.
Software² by The Gradient. A long but worthwhile read on another paradigm shift that will further accelerate the development of AGI. Right now, one of the main problems in AI research is the limited amount of training data: the internet is simply not big enough. However, modern LLMs can generate near-infinite amounts of new, high-quality data, which can then be used to train the next iteration of even bigger LLMs. So far this seems to work exceptionally well. Somehow GPT-4 and similar models generalize well enough to synthesize new "insights" that can be used as training data for other algorithms. Automating training data like this makes it possible for AIs to engage in "self-guided", open-ended exploration of topics: determining on their own what to learn next and then doing so, with novel training data generated on the fly to fit the learning goal at hand. This way AIs could eventually add new capabilities to themselves automatically, and that might get us much closer to true AGI really soon.
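The core loop described above can be sketched in a few lines. This is a deliberately toy illustration, not the article's actual method: the `teacher_generate` and `train_student` functions are hypothetical stand-ins for an LLM producing synthetic examples and a successor model training on them.

```python
import random

# Toy sketch of the "teacher generates data, student trains on it" loop.
# No real LLM is involved; every name here is a stand-in for illustration.

def teacher_generate(n, rng):
    """Stand-in for a model producing synthetic (input, label) pairs."""
    data = []
    for _ in range(n):
        x = rng.randint(0, 100)
        data.append((x, x % 2 == 0))  # the synthesized "insight": parity of x
    return data

def train_student(data):
    """Stand-in for training: the student just memorizes the examples."""
    return dict(data)

rng = random.Random(0)
synthetic = teacher_generate(50, rng)   # teacher creates novel training data
student = train_student(synthetic)      # next-generation model learns from it

# The student can now answer queries it was "taught" via generated data,
# and could itself serve as the teacher for the next iteration.
sample_x, sample_label = synthetic[0]
print(student[sample_x] == sample_label)
```

In the real paradigm the "student" would generalize beyond the generated examples rather than memorize them; the point of the sketch is only the data-generation feedback loop.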
Intelligence Superabundance by NotBoring. A blog post arguing that people will create more awesome products when intelligence "becomes too cheap to meter" and that this will lead to a rise in jobs. There will be more work than before, not less, because the demand for intelligence is far from saturated, even with LLMs on the market. Coming up with novel, groundbreaking ideas and building them into real-world products with the help of AI tools is something people will demand more of in the future. The economy built around this paradigm shift will support that demand, creating more, not fewer, interesting and creative jobs.
What People Build on Top of AutoGPT by Jay Scambler. A tweet storm with lots of examples of crazy systems that people are building on top of AutoGPT. Every week we see new layers added on top of the underlying model architecture: GPT led to ChatGPT, which led to AutoGPT, which is leading to yet another layer of tools being built on top of that right now. Better tools become the starting point for building even better tools, leading to a Cambrian explosion within the tech ecosystem. I wonder how long this will continue and how fast it will ultimately get… maybe we will see new tools and toolchains emerge every day, or even every hour?
🌌 Midjourney 🌌
🎶 Song 🎶
Riders on the Storm by Mezerg
I hope you found this newsletter useful, beautiful, or even both!
Have ideas for improving it? Please let me know.
Cheers,
– Rico