
Goat LLaMA Model Outperforms GPT-4, Adobe Photoshop Firefly, and AlpacaFarm Simulation Framework

AI Daily | 5.24.23

Join us for another exciting episode of AI Daily as Conner, Ethan, and Farb discuss the latest breakthroughs in artificial intelligence. In our first story, we explore the remarkable performance of the Goat LLaMA model, which surpasses GPT-4 and other models on arithmetic tasks, and see how synthetic data and fine-tuning contribute to its success. Then we dive into Firefly, Adobe's generative AI tool now arriving in Photoshop, and how it can enhance productivity and transform design workflows. Finally, we explore the AlpacaFarm simulation framework, a cost-effective approach to reinforcement learning that uses synthetic data and feedback. Don't miss this episode filled with exciting advancements in AI technology.

Key Takeaways:

Goat LLaMA Model

  • The Goat LLaMA model, a fine-tuned 7 billion parameter model, outperforms GPT-4 and PaLM 540B on arithmetic tasks.

  • The use of synthetic data in training the model is an interesting approach to improving performance (a minimal sketch of the idea follows this list).

  • Fine-tuning a model specifically for arithmetic tasks raises questions about the nature of learning mental math and the relationship between language and mathematics.

  • The methodology of taking a small model and fine-tuning it to outperform GPT-4 on a challenging task is significant and holds potential for future tasks.
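
To make the synthetic-data takeaway concrete, here is a minimal sketch of how arithmetic fine-tuning pairs can be generated programmatically. The prompt template and JSONL schema are illustrative assumptions, not the Goat authors' actual pipeline (which also decomposes harder tasks like multi-digit multiplication into simpler learnable sub-steps).

```python
# Minimal sketch: generate synthetic arithmetic instruction/answer pairs
# for supervised fine-tuning. The prompt wording and output file format
# are illustrative assumptions, not the Goat paper's exact pipeline.
import json
import random

def make_example(max_digits: int = 8) -> dict:
    """Sample one arithmetic problem with an exact, programmatic answer."""
    a = random.randint(0, 10 ** random.randint(1, max_digits) - 1)
    b = random.randint(0, 10 ** random.randint(1, max_digits) - 1)
    op = random.choice(["+", "-", "*"])
    answer = {"+": a + b, "-": a - b, "*": a * b}[op]
    return {"instruction": f"What is {a} {op} {b}?", "output": str(answer)}

if __name__ == "__main__":
    random.seed(0)
    with open("synthetic_arithmetic.jsonl", "w") as f:
        for _ in range(100_000):
            f.write(json.dumps(make_example()) + "\n")
```

Because the labels are computed rather than annotated, a dataset like this costs almost nothing to scale, which is what makes fine-tuning a small model on a narrow task practical.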

Adobe Firefly

  • Firefly, a new tool for Photoshop, has been released and is expected to greatly enhance productivity for designers and photographers.

  • The tool, which was trained on Adobe Stock, offers a safe and reliable option for businesses to utilize generative AI technology without copyright concerns.

  • Firefly's integration with Photoshop is just the beginning, as similar advancements are expected in video and audio editing.

  • While there are other tools available for generative AI, Firefly and Adobe provide a trusted solution for professionals and offer unique copyright advantages.

AlpacaFarm Simulation Framework

  • AlpacaFarm is a simulation framework for training and evaluating methods that learn from human feedback, using synthetic data and feedback.

  • The framework offers a cost-effective and efficient way to train models for chat and fine-tune them without relying solely on human feedback (a minimal sketch of simulated feedback follows this list).

  • Synthetic data and simulations can save significant time and resources compared to using real human data, making it a practical approach for training models.

  • AlpacaFarm demonstrates the potential of simulations and synthetic data in improving models, offering an alternative to expensive and time-consuming human-centric approaches in reinforcement learning.
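
As a rough illustration of the core idea, here is a minimal sketch of simulated pairwise feedback: an LLM judge stands in for the human annotator, with injected label noise to mimic human disagreement. The `llm_judge` stub is an assumption for illustration; AlpacaFarm's actual annotators are ensembles of prompted API models.

```python
# Minimal sketch: simulate pairwise human preference labels with an LLM
# judge plus label noise. `llm_judge` is a stand-in stub (assumption);
# swap in a real API call to produce reward-model training data.
import random

def llm_judge(instruction: str, output_a: str, output_b: str) -> str:
    """Stub for an API LLM asked which response better follows the
    instruction. Returns "a" or "b". Replace with a real model call."""
    return random.choice(["a", "b"])  # placeholder so the sketch runs

def simulated_preference(instruction: str, output_a: str,
                         output_b: str, noise: float = 0.25) -> int:
    """1 if output_a is preferred, else 0. Random label flips mimic the
    disagreement you would see across real human annotators."""
    choice = llm_judge(instruction, output_a, output_b)
    if random.random() < noise:
        choice = "a" if choice == "b" else "b"
    return 1 if choice == "a" else 0

# Preference pairs like these would feed a reward model for RLHF.
pairs = [("Summarize this paragraph: ...", "terse summary", "rambling summary")]
labels = [simulated_preference(i, a, b) for i, a, b in pairs]
```

The appeal is exactly what the hosts describe: the judge runs in seconds and costs cents, so you can iterate on RLHF methods without waiting weeks for human annotation.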

Links to Stories Mentioned:


Transcript:

Conner: Good morning and welcome to AI Daily. We're back with another great episode for you guys today. I'm Conner, joined by Ethan and Farb. Our first story up is Goat, a fine-tuned LLaMA model that outperforms GPT-4 on arithmetic. It's a pretty small 7 billion parameter model, but it can outperform both GPT-4 and PaLM 540B on arithmetic tasks.

Ethan, what have you read on this?

Ethan: Yeah, overall, a fine-tuned model trained on a specific task is gonna perform better than a general one, so nothing too crazy new here. But they did have some interesting ways of attacking this problem. One is they used a lot of synthetic data, which we've talked about on the show before.

Really cool approach, using synthetic data to generate all these arithmetic tasks and make the fine-tuning process actually perform better than these other models. But at the end of the day, we've talked about Toolformer on the show and using a calculator. At the end of the day, you're fine-tuning a model for arithmetic.

You're pretty much saying, hey, can we make this model better at mental math? So I'd prefer using something like Toolformer with a calculator, but cool paper. Farb?
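
As a rough sketch of the tool-use alternative Ethan is describing: instead of teaching the model mental math, let it emit a marked expression and compute the answer deterministically. The [CALC] tag convention here is an assumption for illustration, not Toolformer's actual API.

```python
# Minimal sketch: post-process model output so arithmetic is computed by
# code, not by the model. The [CALC]...[/CALC] markers are an assumed
# convention for this sketch, not Toolformer's real interface.
import re

def run_calculator_calls(model_output: str) -> str:
    """Replace [CALC]expr[/CALC] spans with their computed values."""
    def evaluate(match: re.Match) -> str:
        expr = match.group(1)
        # Only allow digits, whitespace, and basic operators before eval.
        if not re.fullmatch(r"[\d\s+\-*/().]+", expr):
            return match.group(0)  # leave unrecognized spans untouched
        return str(eval(expr))
    return re.sub(r"\[CALC\](.+?)\[/CALC\]", evaluate, model_output)

print(run_calculator_calls("The total is [CALC]1234 * 5678[/CALC]."))
# -> The total is 7006652.
```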

Farb: Ethan, unimpressed by the gravitas of this paper, and I can't disagree with him. It's very cool. I'll say the coolest part is that I think it falls on a very cool trend line of continuing to teach these models more specific tasks.

I think the point that Ethan is making is that if you train it specifically on a very specific type of data, it'll outperform generalized models. Not to put words in your mouth, Ethan, but we've seen that before. And I also think you're right that it's pretty cool that they're using some synthetic data here to pull this off.

We're gonna continue to see this happen across all sorts of specific datasets and knowledge domains. And to me it raises the question of what's going on exactly in the human mind, beyond just using your language centers to process mathematics. It's interesting to try and take a language model and force it to do a type of math, which arguably is a form of language in itself, but a lot less loose than most spoken languages are.

Ethan: Yeah, I think there's some merit to the meta question of how we learn mental math, how that works internally, and what that's gonna bring for other types of tasks in the mind to these models. So it's got some meta coolness, but yeah, I agree.

Conner: I think the methodology here was pretty important.

It's taking a very small 7 billion parameter model and fine-tuning it to be better than GPT-4 on this pretty difficult task. That is pretty interesting to see, and hopefully we'll see that in more tasks in the future. Next up, we've got Firefly working in Photoshop. Firefly came out, I think, about a month ago now.

Very good tool. I've really enjoyed using it, and I think we all saw it was gonna come to Photoshop next. Farb, have you watched this? What have you seen?

Farb: Yeah, I've been checking out the demos. I didn't get a chance to download it and try to get access to the beta, but I plan on it. I do a lot of photography.

I'm actually going through a friend's wedding that I shot right now, and I badly want to be able to play with some of these photos in Photoshop using this tool. I use Luminar AI, which is a really fun photo editing tool. Pumped to see this in Photoshop. It's probably difficult to overstate how big a piece of news this is for folks who use Photoshop. This is going to change the productivity of designers, maybe reducing 90% of their workload on certain tasks. This is just mind-blowing, and it's awesome to see the folks at Adobe do this. And this is just the beginning.

We're gonna see this in video. I think maybe they're already doing it with some video stuff. But we're gonna see this in audio, video, photography. Just awesome to see.

Conner: Yeah. I'm more of an Affinity man myself, but it's really nice to see this from the incumbents. Apparently they even said they're working on a compensation model for Adobe Stock contributors.

Very interesting, considering Firefly was trained on Adobe Stock. Ethan, what do you think about this?

Ethan: Yeah, we've seen a few different tools do this. You have Runway and a few other open source tools that can do this, but the main story here is that for businesses who actually want to use generative AI tech, Firefly and Adobe are really your go-to to avoid potential copyright claims, et cetera.

I believe Firefly worked with, I'm not sure if it was Shutterstock or something, but they actually have all the copyright for their images, and their gen AI is quote-unquote safe to use for all these kinds of potential issues for businesses. So it's in a tool that tens of millions of people use every single day, and we're gonna see some really cool outputs from it.

Conner: Yeah, this is definitely the tool for professionals. Farb, you said this is in beta now, would you say?

Farb: Yeah. It looks like it's in beta now. I think some people have access, and I assume they'll be opening that up over time.

Conner: Exciting. Our third story today is AlpacaFarm, a simulation framework for methods that learn from human feedback.

So here they build a framework to try out and experiment with different methods for reinforcement learning from human feedback. But the key point here is that it's not actually human feedback. It's mostly synthetic data and synthetic feedback, and it looks like a very good framework to test how we train these models and fine-tune them for chat.

Farb, what did you see on this?

Farb: We talked, I think, yesterday about all of these models using GPT-4 or GPT-3 to do some part of the work to help train a model, or do something that would otherwise take a lot of time to code or to find data. So this was super cool to see exactly that: they're creating a whole bunch of synthetic data, training on it, and doing in what they said was, I think, two hours and a couple hundred dollars what might have been thousands or tens of thousands of dollars and weeks and weeks of human time to accomplish.

This is just the beginning of this stuff. We're gonna continue to see this over and over again. It's kind of interesting, though: you needed to do the first basic versions with humans so you could get these AIs good enough that they can be substitutes for humans. But wait for the superintelligent AIs that are using slightly dumber versions to get themselves tuned up, and then you're using the superintelligent AIs to create the training set for the next hyper-superintelligent AI.

It just keeps going. It's AI all the way down until you hit turtles.

Conner: Ethan, what did you see on this?

Ethan: I love it. I'm absolutely in love with simulations and synthetic data, and almost the ability to tear apart this logistics moat that some of these bigger companies have. RLHF is such a good framework for fine-tuning and improving some of these models.

And as the paper pointed out, you need thousands of dollars just to RLHF a small model. So being able to simulate this, do it with $200, and come within that accuracy of humans is very meta, but also very practical right now if you're trying to train some of these models.

Conner: I think some of the final feedback you'd want to use for your production model is best from real people.

But I think it's pretty key here that there are different methods to do reinforcement learning with human feedback, and if you wanna try different methods, AlpacaFarm is how you try them. You can simulate the data; you don't have to wait weeks or months to get that data from real people.

Farb: One hundred percent. I think it performed exceedingly well too, if I remember the results from the paper.

Conner: Yeah, very powerful. Okay, so what have you guys been seeing this past day?

Ethan: Yeah, I saw Google's Merchant Center. So for people who use Google and sell products, you know there's millions of small businesses out there, and they released what's called Product Studio.

So if you're selling makeup or furniture, you can now use gen AI to mock up some of these background images. We've seen a lot of tools out there, and some people are using them. But now, for all these small businesses to open up Google like they always do and go make some awesome product imagery and grow their business is great.

So we'll link it below.

Farb: Well, hot on the heels of Google's big AI show, Microsoft, I think, is doing Build today, sort of re-announcing all the cool AI stuff that they announced yesterday. Haven't had a chance to watch that, but I'm looking forward to checking out some of that presentation.

They announced Copilot for Windows. They announced a cool bunch of dev tools for Windows using AI. Waiting for the Meta version of this. And Apple's doing a big presentation in just a week or two here. It'll be interesting to see if Apple starts using the word AI a little bit more.

They're incredibly well positioned to do this sort of stuff, and if internally they reason that the market is gonna reward them for using that term and putting things in terms of AI, we're probably gonna see some more AI from Apple as well.

Conner: Yeah, Apple's live-transcribe thing we talked about last week would work very well in the headset, so I wouldn't be surprised to see that from them.

Ethan: Yeah, I wanna throw one more in too. It's kind of AI related, especially on the robotics side. Did y'all see Brett Adcock's new humanoid robot company? Figure, yes.

Farb: Yeah. They raised, I think, 70 or 75 million. And they went from, I think they built the thing in a year or something like that, they said.

Yeah. Super exciting. Here come the robots. OpenAI, I think, has a robot, or maybe it's related to them.

Ethan: I think there's three of them now. OpenAI has their one, and then Figure, and then Tesla, of course, is demoing some stuff.

Conner: So yeah, Tesla had some more demos recently of their humanoid robot.

So yeah, super exciting. Yeah, I've been playing around with GitHub co-pilot chat. I got access to that earlier today. So on vs. Code Insiders, you can download the co-pilot nightly and they have the new chat extension. It works a lot well than how copilot previously worked where you had to do weird brushes and you had to like comment code and wait for it to maybe write the code.

Now you highlight some code and it'll rewrite it for you directly in VS Code. It'll show a git diff, and you can even ask for new code directly in VS Code. It'll give you that too. It's very nice to use.

Ethan: Maybe I'll open VS Code again.

Farb: I was gonna say, is VS Code your preferred environment? There it is.

Conner: But this is a private beta, Ethan. I dunno if you have access. Oh,

Ethan: I'm gonna stick with Replit then.

You know the real goats here.

Farb: This episode brought to you by Replit.

Conner: Every episode brought to you by Replit. Yeah, sure. All right, well guys, thank you. Another great episode today. Make sure to subscribe on YouTube and have a good day. We'll see you tomorrow. Bye guys. Peace.
