
Fundraising Frenzy | Playground AI | OLMo

AI Daily | 6.29.23

Welcome to another episode of the AI Daily podcast, your premier source for all things AI and machine learning. Join us as we discuss three significant fundraising events in the AI world, dissect the new mixed image editing tool by Playground AI, and explore the unveiling of OLMo, a new 70 billion parameter model by the Allen Institute.

Key Points

1️⃣ Fundraising Frenzy

  • Three large companies secure significant fundraising deals in the AI industry: Runway ML raises $141 million, Inflection raises $1.3 billion, and Typeface raises $100 million.

  • Runway ML focuses on video and generative AI interpolation, Inflection builds foundation models with a cluster of 22,000 H100 GPUs, and Typeface specializes in generative AI for content creation.

  • The fundraising frenzy in the AI sector shows no signs of slowing down, with global investment dollars flowing into AI and GPU-related ventures. Expect more news on fundraising and acquisitions in the future as the industry continues to grow and evolve.

2️⃣ Playground AI

  • Playground AI introduces a new mixed image editing tool, combining elements of Photoshop and Figma in a collaborative generative AI tool.

  • The image editing and content creation space is vast, but Playground AI stands out with its well-built product and the ability to generate images quickly.

  • Despite the crowded market, Playground AI's user-friendly experience, tutorials, and free access make it worth trying out for creators seeking better visualization and editing tools.

3️⃣ OLMo by the Allen Institute

  • The Allen Institute introduces OLMo, a new 70 billion parameter model focused on scientific research and discovery.

  • OLMo is an open model, with the Allen Institute planning to share every step of its development for future scientists and researchers to build upon.

  • Partnerships with AMD, Surge AI, Mosaic, and others aim to support OLMo's training and data labeling, potentially shaping the competition between AMD and Nvidia in the hardware space. The open nature of OLMo has significant implications for the industry and may attract startups and corporations looking to leverage open-source models.

🔗Episode Links

Follow us on Twitter:

Subscribe to our Substack:


Conner: Good morning and welcome to AI Daily. We have another great episode for you guys today. First up, a new fundraising frenzy: three different, very large companies have raised a lot of money. Runway ML, of course, is famous for their video and generative AI interpolation. They raised $141 million, largely led by Salesforce and Google.

And then we have Inflection, who we've talked about twice in the past couple days, who raised $1.3 billion from Microsoft, Nvidia, and quite a few others. And then we have Typeface, a sort of generative AI for work, who raised a hundred million, also from Salesforce and Alphabet. Farb, these are three pretty big fundraising deals, all announced on the same day, just one of many days. What do you make of this?

Farb: I think we covered three other ones either yesterday or the day before, something like that. The feeding frenzy is not coming close to ending. You can get used to us talking about fundraising stories for the next however many years, and then acquisition stories, and then more fundraising stories.

So it's great to see. There are some awesome teams working on some pretty impressive things. The global investment dollars will be chasing AI and GPUs for the foreseeable future. So get buckled in and get ready for more.

Conner: Just one of many. Ethan, you have any thoughts on this?

Ethan: Yeah, this is a big day for a lot of these companies.

You know, the big important point here to me is that these are three fairly different types of companies. Typeface competes in kind of a Jasper-esque space on content creation, a traditional product-facing business. They've been around for a few years, really doing what people have been using GPT-3 for for a while now. They've embedded themselves; good business. You have someone like Inflection at a true foundation model level, building a cluster of 22,000 H100s, probably spending half or more of their round just on that cluster. And then you have someone like Runway sitting a little bit in the middle, building their own models, building a product.

So you're seeing these big fundraisings across AI, across the entire stack, from the foundation model level all the way up to these kind of more mature products. We're gonna continue to see it, and the frenzy continues.

Conner: Yeah, it's a very good way to split it. Runway is, of course, mostly just a research lab. A lot of their work is the research itself. They have some products, of course, but mostly research. And then Inflection is, again, just building a foundation model; $1.3 billion, most of that going to GPUs. All the way over to Typeface, just another generative AI company that is doing good work. So congrats to all three of you guys.

Next up, we have Playground AI. They have a new mixed image editing tool that is essentially a lot like Photoshop and the Photoshop generative AI we've seen so far. It's sort of a mix between Photoshop and Figma, so it is a collaborative generative AI tool. Another good tool. Ethan, any specific thoughts here?

Ethan: Yeah, it's a massive space. You have Photoshop doing it; we covered Clipdrop the other day. And Playground, props to them. I do believe they build a lot of their own models, and they piece them into a very well-built product. It's a massive space at the end of the day: image editing, content creation, creating new forms of visualization and giving better tools to these creators. So they're playing in a massive space, they've made a good product, and they're keeping up to date, or even faster than some of the Photoshop players or Stability themselves on the Clipdrop end. So yeah, props to Playground.

Conner: Yeah, it's a very massive space, but it's also a very crowded space. Farb, do you see any of these winning out? Do you see them all winning out? What do you see coming?

Farb: I spent a bunch of time playing with Playground AI yesterday and today, and I was pretty impressed by it. It's pretty fast. It generates the images pretty quickly, and they provide just enough productization to let you get started without just having an empty prompt as the only thing you're starting with.

It's not like working in Google Colab. It's a pretty well-productized experience, and I say try it out. It's free. You just need to log in and you can start playing right away. And they give you a lot of tutorials and a lot of help to do it.

Conner: Very exciting. Lastly today we have OLMo, a new 70 billion parameter model by the Allen Institute, or AI2 as it's called for short.

This is a model largely focused on scientific research and scientific discovery. The Allen Institute is, of course, a nonprofit that's done a lot of research in AI, and this model is gonna be completely open. They've said every step of the way of how they make it will be open for future scientists and future researchers to build upon that work.

They're partnering with everyone from AMD on the hardware side all the way to Surge AI and Mosaic, quite a few partners here, but focusing on an open model in the end. Ethan, do you have any thoughts on this? What do you think about it?

Ethan: Yeah, Surge is helping on the data labeling side. You have the Allen Institute for AI, also just great researchers there.

The biggest news to me was that they're actually trying to train this on AMD chips. I'm very curious to see how that turns out: what model they're using, what the architecture ends up being, how long this takes them too. I know they're targeting 2024. We'll see if that's Q1, Q2, what that ends up being.

AMD needs to get their feet into this water. They've been getting absolutely annihilated by Nvidia right now in terms of everyone's training runs, every cluster being built. So if AMD can pull this off and show that, hey, a 70 billion parameter open-source LLM was built on AMD chips, I think you'll see some actual real startups and corporations possibly redirect their focus, especially with this chip shortage.

So for me, the biggest news here is that if AMD can pull this off, this is great for them.

Conner: Yeah, AMD is lagging pretty badly in the hardware space, but if OLMo does this right, if OLMo open-sources everything in a way where others can follow it step by step, Farb, that'd be pretty big for the war between AMD and Nvidia. Don't you think? Sorry, what's that? The contrast between AMD and Nvidia: if OLMo pulls this off in a way where everyone can follow step by step, how do you think that changes where people build their models?

Farb: You know, this is probably bigger news for AMD than it is for either of these other two, which is, by proxy, potentially huge news for the whole industry.

Interestingly enough, these folks are super focused on science and academia. They even want to provide education for researchers and scientists to build their own models based on all of the open stuff these guys create. The data will be open, the training curves will be available; they plan on making everything available.

It's kind of interesting that they're targeting next year. Who knows what the competition's gonna be like when they actually release the thing, but you know, you gotta start somewhere.

Conner: Yeah, all the open training and all the open knowledge right now of how to train a language model is built on Nvidia. So this is, I think, AMD's second big partnership after Hugging Face, so we'll see where it goes. Well, those are our three stories. What have you guys been seeing, Ethan?

Ethan: Yeah, this one's from Eric Hartford, a great AI researcher in the space. He's announcing OpenOrca, a dataset inspired by Microsoft's Orca paper, which we covered a little while back. OpenOrca is pretty much the entire dataset, a bunch of data. They want a compute sponsor for different size models; I believe 7 or 13 billion parameters are some of their initial targets. So if you want a model to train and you have some compute, you now have a new dataset and a new way to train models, so check it out.

Conner: Very exciting. Farb, what about you?

Farb: A good friend of mine who's a writer and film producer showed me Sudowrite yesterday, which is sort of an AI-powered platform for writers. It's early; it's not a fully figured-out product. It's running off of GPT-3.5 and 4, if I remember correctly.

Their opinion was that it's potentially useful. They hadn't dug into it super heavily yet, so who knows, it may be more useful. They had a few little critiques as a writer that they thought could have improved it. But it's great to see folks getting into these more niche spaces and bringing AI to just about everything you can think of.

Conner: Very, very exciting. Yeah, I saw Salesforce introduced their XGen-7B, which is their 7 billion parameter LLM trained for 8,000-token sequence length. Normally this would be a main story, but between the countless other 7 billion parameter models we've seen, and between Salesforce's own investments from Runway to Typeface, the story was kind of eclipsed. It's not that big of a story when you can see MPT, Falcon, LLaMA, RedPajama, OpenLLaMA, Dolly; there's a lot of 7 billion parameter models out there now. So if you're a company training a model, 7 billion parameters isn't that newsworthy anymore, honestly. Sorry guys, do better. Be better. I mean, in all fairness, they probably started training this a couple months ago when there weren't that many other 7 billion parameter models out yet, and they just happened to release it late to the party. So better late than never. Better late than never, though. Well, great show today guys. Thank you all for watching. Make sure to like and subscribe, and see you guys tomorrow. See ya.

AI Daily