Hey guys, so you guys hear me all right?
Can you hear me?
Yeah, I think we can start.
I'm ready.
Okay, awesome.
All right.
Are we good to announce?
Yeah, why not?
Let's do it.
Okay.
Cool.
Let's tweet it first before making the announcement.
Yep.
I think they're not expecting the way we're going to announce this.
Insanely, insanely great value for both Perplexity users and Rabbit R1 holders.
So, yeah.
Yep, yep.
All right.
Let me just go ahead and read it.
All right.
All right.
So, we're going to start the conversation in a bit.
But first, both the Perplexity and Rabbit official accounts will tweet the announcement with the details.
So, we're ready when you are.
Let's wait for the tweets and then we can start the conversation.
Okay.
I think it's done.
So, I'm pretty excited to share that Perplexity and Rabbit are partnering together.
We are excited to power real-time, precise answers for the Rabbit R1 using our Perplexity online LLMs.
Yeah, the ones that have no knowledge cutoff.
They're just always plugged into a search index.
And the first 100,000 Rabbit R1 purchases will also get one year of Perplexity Pro.
Whoa, where did that come from?
I didn't know that Spaces had that feature, but yeah, continue.
Okay.
Yeah, the first 100,000 Rabbit R1 purchases are going to get one year of Perplexity Pro for free.
Actually, Perplexity Pro, one year free, is 200 bucks.
So, if you paid 200 bucks for the Rabbit R1, you're getting twice the value.
Yeah.
So, we had an interaction on X a couple of days ago.
And then over the following couple of days, the teams have been working really hard together to make this happen.
And I think, you know, to me, it's a no-brainer.
If you think about it, you know, the Rabbit R1 is priced at $199.
No, actually, not $200, $199.
No subscription.
Sure, no.
And then Perplexity, Aravind, is generous enough to offer Perplexity Pro for a whole year.
That's actually 200 bucks.
So, you know, that's one extra buck on top of it.
But I want to share a little bit of background on what's going on.
Because, to us, you know, we've been huge fans of Perplexity from day one, since the launch.
You know, even though I hadn't gotten a chance to talk to everyone personally,
we're a pretty early startup and we started trying out and testing their APIs.
So, we are their customers too, anyway.
So, I think what impressed me most, and, you know, quite ironically,
it's the same reason why we were so disappointed with, you know, the other offerings out there,
is that, for instance,
they did very, very poorly on what they call web browsing or up-to-date search.
And that's when we found Perplexity and we immediately did a test round.
That was like many, many years ago, sorry, many, many months ago.
And the results were insanely good.
It blew all the other guys out of the water.
Even though the way that we connected and established this partnership is totally new,
I think it was destined to happen, you know, at least from my perspective,
because our angle is to bring the best experiences to this little piece of device.
And, you know, we choose whoever is the best in the industry.
And I think right now, for up-to-date information search and general knowledge,
Perplexity is definitely the clear choice.
And I noticed on the Twitter thread as well, you know,
a lot of people are replacing Google search with Perplexity.
That's certainly also my case.
So I just want to speak from, you know, being a customer myself, being a user myself;
that's my angle on
why this deal makes perfect sense.
Yeah.
Yeah, thanks a lot, Jesse.
So actually, a lot of people might not know this.
The way the whole thing came together was, I think I posted something for fun with the Rabbit.
I think everyone on Twitter was just taking your Rabbit device and posting their product on its screen.
And then you looked at it and you were like, let's work together.
Quote-tweeted me.
Then I quote-tweeted you, and then you just made it official on Twitter.
So we never even actually exchanged emails or anything of that nature or got introduced in the traditional way.
The whole thing came together just by people being excited for us to work together.
So what do you have to say about that?
And I guess you're also, like, you know, going viral on Twitter, or rather we should call it X here.
And, you know, you've grown so fast, so quickly.
So how are you thinking about all this stuff?
Yeah, so, in fact, over the last couple of days,
we saw basically another 10,000 Rabbit R1 orders every day.
So right now, as we're speaking, the fifth batch, which takes us to the first 50,000 units, has already sold out.
That means we're already 50% of the way to the first 100K.
But I also want to share a little bit more.
So that's like 10 million dollars in sales.
Yeah, 10 million dollars in five days.
It's okay, I guess.
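For the simple math behind those numbers, here is a rough sketch; the figures are taken from the conversation, and the $10 million is an approximation:

```python
# Rough sketch of the numbers mentioned in the conversation.
unit_price = 199            # Rabbit R1 price in USD, no subscription
units_sold = 5 * 10_000     # five batches of 10,000 units each
pro_value = 200             # one year of Perplexity Pro, in USD

print(f"revenue so far: ${unit_price * units_sold:,}")   # ~$9.95M, "10 million dollars in five days"
print(f"value per buyer: ${unit_price + pro_value} for a ${unit_price} device")  # roughly twice the price
```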
But I think, you know, what we are currently doing is that, now that we have a pretty good understanding of the demand,
we've already communicated ahead with our logistics, you know, regarding the OEMs and ODMs,
and we want to make sure they catch up with the speed.
So I think, you know, after these 50,000 units, we will no longer distinguish future orders in batches.
What we're going to do is basically work with the OEMs and ODMs and our own internal hardware team to catch up with the speed, so that, you know, we are no longer facing this one-month-after-one-month shipping gap.
So the goal here is a hard date of March 31st, which is Easter, when we're going to start shipping the first batch.
We're going to do our best to ship all the future batches as early as we can.
And that's the best we can do at the moment.
But I think, you know, hopefully our partnership will further incentivize that, because really, if you just do simple math, it's a $199 device and $200 of value back.
That's not too hard to understand, but, you know, more importantly, it's not about the money.
It's about getting the best experiences in a very, very, you know, well-designed and intuitive device.
With, you know, not only what GPTs can offer, or what the other devices can offer, but with a large action model running behind it.
So I think it's quite promising. And speaking of the deal, I think, you know, I'm not sure if you agree with me, but it seems like we accidentally invented a new type of deal-making that happens on Twitter with quote tweets.
I don't think any, you know, major players have ever done it so fast or have ever done it in such a manner.
So I want to hear your thoughts on it. It almost started with a meme, and then we quickly made it official and quickly made it work on our devices.
And then here we have the partnership announcement. So what's your thought on that?
I mean, we love working with you because you move incredibly fast. Like, the moment you said you wanted to work together, we immediately started a Slack channel, and then your team started hacking on this and you sent me a demo.
And then, you know, we got energized looking at the demo. We were like, hey, these guys work fast, like as fast as or even faster than us.
And we used to think we work fast, but after seeing you, we think we can do even better.
And then we want to make our APIs even faster. So maybe I want to lead from there to, like, you know, your thoughts on the whole voice-to-voice form factor.
Because the Rabbit device is definitely taking us beyond just consuming screens and text in the form of pixels, to just interacting more naturally.
So what are your thoughts on the next stage of how people consume and interact with all these AI chatbots and assistants?
Yeah, so I think, you know, being our age, we unfortunately grew up before the dictation engine was invented, and then it was invented and it was put to use in a horrible way. I think our generation are victims of the early days,
of the early days of the dictation engine, the early days of natural language processing before large models, of course, and transformers and all that.
So I think, you know, me personally, I identify myself, probably along with everyone here, as having PTSD from the early versions of dictation engines.
That's why, I guess, it created such a strong impression on our minds that, okay, maybe voice is not the right way to go and we'd rather prefer to type.
But I think, you know, our principle is very simple: what's the most intuitive way to communicate, right?
Like, think about it: if we converted this Twitter Spaces into a typed Twitter Spaces, or, you know, even worse, a fax Twitter Spaces, a non-instant-message Twitter Spaces,
I don't think we could deliver all this information in such a short period of time.
So, you know, if you think about how humans communicate with humans, at least before the, I guess, the Neuralink stuff gets put into use, natural language, especially conversation in voice, is still the most efficient way.
Now the problem becomes easy, because we just need to fix the PTSD.
But I think if you look at the past three, four years, probably especially the past three years, a lot of the fundamental infrastructure around that has been significantly improved. And I'm not sure how many of the listeners here have, like, a five-year-old, six-year-old, seven-year-old kid,
but among the younger generation, the kids born after 2010, I see that they actually prefer the dictation icon on the keyboard rather than starting to type.
So I think user behavior in a different generation has already started shifting, and of course the fundamental reason is that the underlying infrastructure is now good enough and robust enough.
So, for us, you know, we are not saying that you can only talk to the R1; if you shake the R1, a keyboard will pop up.
But if you think about the most intuitive way, and if you're in a rush, there's nothing better than just finding that analog button, pressing and holding, and starting to talk.
So I guess that's our design principle. You know, we understand the current challenges and difficulties,
but we want to push just a little bit further, because the method is not wrong, right? The approach is not wrong.
It felt wrong because the technology wasn't ready, but I think in the past three, four years a lot of the infrastructure has been significantly improved, to where we're now confident enough to build a device like the R1 in this form factor.
Yeah, that's cool. And I also want to, like, you know, lead you to the follow-up question there, which is: what do you think about today's latency, and what do you think the latency for voice-to-voice is going to look like maybe, you know, six months from now, or a year from now?
Yeah, I think that's a major, major task here. I was actually talking to our engineering team; I think we should have a special task force just to focus on latency across all the features, not only the search features, but all the features.
I'll share a story. I built an app many, many years ago called Music Flow. It primarily launched in Asia because we had partnerships with music labels back in Asia, and it was not launched here.
And the app is extremely, extremely simple: the entire screen is a button you press and hold to talk, and the music starts playing. And we did a lot of work on the back end to make sure the first piece of music starts playing within a second, 800 milliseconds to be exact.
And that app just went viral because of the speed. We didn't create any additional features compared to, you know, traditional Spotify or traditional Apple Music.
We just did that, you know, voice plus sub-second response time.
And another thing we did is that we basically used natural language processing to parse every part of the lyrics, so that you can literally just, you know,
say a random part of the lyrics and it still matches the song. So that's basically all we did. But I think it hugely benefited from the sub-second response. And that lesson, you know, struck my heart so hard that when we designed rabbit OS, sub-second was always our goal.
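As a toy illustration of the lyric-fragment matching just described: the catalog, threshold, and matching approach below are illustrative assumptions, not how Music Flow actually worked.

```python
import difflib
from typing import Optional

# Tiny illustrative catalog: song title -> a line of its lyrics.
CATALOG = {
    "Song A": "we were dancing all night under the neon lights",
    "Song B": "take me down to the river where the water runs cold",
}

def match_song(spoken_fragment: str) -> Optional[str]:
    """Return the song whose lyrics best match the spoken fragment, if any."""
    frag = spoken_fragment.lower().split()
    best_title, best_score = None, 0.0
    for title, lyrics in CATALOG.items():
        words = lyrics.split()
        # Slide a window the size of the fragment across the lyrics.
        for i in range(max(1, len(words) - len(frag) + 1)):
            window = " ".join(words[i:i + len(frag)])
            score = difflib.SequenceMatcher(None, window, " ".join(frag)).ratio()
            if score > best_score:
                best_title, best_score = title, score
    return best_title if best_score > 0.6 else None

print(match_song("down to the river"))  # -> Song B
```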
But if you look at the reality right now, I'm sure a lot of people have seen the demos posted on Twitter and on Discord: when I demonstrate playing a song, that's on par with sub-second latency.
But in reality, you know, search, web browsing, up-to-date information search, and also some of the vision features, like what GPT-4 vision provides and what our own vision features provide, sometimes take two seconds, sometimes more than that.
And vision, you know, the first time I showed it, probably took like 15 seconds, and we have another version now that currently takes about seven seconds.
So those are things I'm not happy with, but I think, you know, latency shouldn't be judged feature by feature; latency should be a universal bar for natural language interaction with any device.
And I remember I read a paper, I can probably find it later and post it on my Twitter, but there's research on how fast the human brain understands and handles natural language. I think, you know, there are a couple of different languages and they can differ quite a lot.
Like, I think for Japanese, as well as a couple of other languages, you actually process the information faster because of how the language is structured. But I think 500 milliseconds should be the golden bar.
I did a couple of tests internally, and I think 500 milliseconds for a natural language voice response is the golden bar.
You don't want to be slower than that, but you don't want to be faster than that either.
If it's faster than that, people get put off, because they think, oh, you responded too soon, and it's creepy,
like you answered my question ahead of time. But if it's slower than maybe, you know, 800 milliseconds, people get confused and try to ask the same question again,
thinking that maybe it's a network issue or maybe the device didn't catch their voice, right?
So I think 500 milliseconds is kind of a universal benchmark that we're trying to hit across all the features.
That's kind of like my take on it.
And yeah.
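A minimal sketch of the 500 to 800 millisecond window described above; the thresholds are taken from Jesse's remarks here, not from any published spec.

```python
def judge_voice_latency(latency_ms: float) -> str:
    """Classify a voice response time against the bands described in the conversation.

    ~500 ms is the "golden bar"; noticeably faster can feel uncanny,
    and beyond ~800 ms users tend to re-ask the question.
    """
    if latency_ms < 500:
        return "too fast: can feel uncanny, as if the answer was pre-empted"
    if latency_ms <= 800:
        return "on target: feels like a natural conversational pause"
    return "too slow: users may assume a network issue and repeat themselves"

for sample_ms in (320, 500, 640, 900):
    print(f"{sample_ms} ms -> {judge_voice_latency(sample_ms)}")
```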
And when you consider, you know, whatever latency you have today, how do you compare that with, you know, other similar apps, like ChatGPT voice-to-voice?
Have you tried comparing?
Yeah, so we have a technology we call kernel that we started working on pretty early, more than two years ago.
It basically establishes a streaming model, because think about why there's latency.
You press the button, the microphone starts recording, and the recording goes into an audio file.
That audio file needs to be converted into strings, and it's sent to the dictation engine, the TTS, text-to-speech, sorry, actually the speech-to-text engine, and converted to text, and then that text is sent to,
maybe GPT or Perplexity or whatever large language model, for intent understanding, and then it starts generating at its own speed.
And then it's a round trip, right? That was a single trip, and everything repeats again on the way back.
So if you add all this together, if you just go out and build a voice AI with no optimization based off GPT-4, we know for a fact that for a single dialogue turn you're looking at probably five to six seconds.
But we made a streaming model where we basically cut everything into very, very small time-stamped chunks and make the entire pipeline streaming.
I think I'm not the best guy to talk about this, maybe our CTO can write something about it later on, but we do have the technology to turn that sequence into a stream.
We're not necessarily accelerating GPT's or Perplexity's speed at the moment, but with this streaming mechanism, if you ask something that doesn't require search or up-to-date information, we're constantly hitting the benchmark, which is 500 milliseconds per response.
But I wish, you know, our teams, you and me, could do something specifically on up-to-date information search.
Maybe we can push this part of it, because again, whatever we push here is going to become the industry standard.
Because right now, it is what it is.
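A minimal sketch, with purely hypothetical stage timings, of why the sequential pipeline described above adds up to several seconds and how a chunked streaming pipeline cuts the time to the first spoken audio; no real rabbit OS, kernel, or Perplexity APIs are used here.

```python
# Hypothetical stage timings, in seconds, for one voice turn.
# These are illustrative only, not measurements of rabbit OS or Perplexity.
STAGES = {
    "record_and_upload": 0.4,   # finish capturing audio and send it
    "speech_to_text":    1.2,   # transcribe the full clip
    "llm_generation":    2.5,   # large model produces the full answer
    "text_to_speech":    1.0,   # synthesize the full reply audio
}

# Naive pipeline: every stage waits for the previous one to finish completely.
sequential_latency = sum(STAGES.values())

# Streaming pipeline: audio, text, and speech flow in small chunks, so each
# downstream stage starts as soon as the first chunk arrives. The user hears
# the first audio after roughly one chunk of each stage, not the whole thing.
CHUNK_FRACTION = 0.15  # assume ~15% of each stage is enough to emit a first chunk
time_to_first_audio = sum(t * CHUNK_FRACTION for t in STAGES.values())

print(f"sequential, wait-for-everything latency: {sequential_latency:.1f}s")
print(f"streaming, time to first spoken chunk:   {time_to_first_audio:.2f}s")
```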
Yeah, absolutely. Yeah, we are certainly at the cutting edge here, and in fact, the fact that you wanted to do it through streaming already makes it much better.
Like, the perceived latency is already a lot better than waiting for the full response.
And, you know, I think there are so many more things we can do to speed it up.
So I also want to talk to you about, you know, selling 50,000 units in five days.
There was a tweet from Daniel Gross where he said, for context, the iPod sold 125,000 units in its first quarter.
And the iPod is considered a historically successful product.
And the iPhone sold like 270,000 in the first 48 hours.
So obviously, looking at the pace, the trajectory of the R1,
it's likely going to beat what the iPod did for the whole quarter.
And so obviously, you know, there are going to be high expectations for the next device you're going to make.
So what are your plans for the R1, or for future devices?
Yeah, so, first of all, I didn't quite know that number until I saw that tweet.
And that made my head spin for a sec.
But I also realized that, you know, we're talking about no internet, no Twitter, versus the internet era.
I mean, we're at a different pace, right?
So I don't think we're outperforming the iPod at the moment.
We're a very, very young startup.
And this is our first-generation device.
When Apple launched the iPod, they had a tremendous history of setting the bar and making everything iconic.
Pretty much everything they did, you know, set the bar.
So first of all, I'm not delusional just because of the numbers we're getting in the first couple of days.
We're not so arrogant as to think that, okay, this is better than the iPod,
sales-performance-wise, because, you know, the times have changed and they didn't have social media whatsoever.
So I would assume if Apple had had social media back then, their numbers would probably be way more than that.
So from that angle, you know, to your question and to your point:
I don't think anyone is certain about which form factor will be the best.
I just saw Mark Zuckerberg had a post on Twitter today where
he talks about open-sourcing the Llama 3 models they're developing.
And at the end of the mini clip, he said he believes glasses are the best form factor because they see what you see and hear what you hear.
I don't quite agree with that.
I think, you know, no one has figured it out, and no one should be confident enough to say, okay, this is the best form factor, before they've even tested it in users' hands.
So the hardware decisions for the R1 were not about pushing the edge on all the components.
Quite the contrary, it's the other way around: it's a result of de-risking.
Because we are new, the operating system is new, this entire AI stuff is new,
and we don't know what the best form factor is either.
And plus, there are a lot of latency issues, of course.
So we didn't want to risk presenting a completely new way of interacting with devices,
plus giving you another kind of insecurity of, oh, this is a completely new form factor and you have to learn how to use it from scratch.
And that's why, when we designed the R1, we thought, okay, we need to present something that you're already familiar with, but in a new way.
And that's where I'm so happy that the communities are getting it.
Because the first thing I did back then was buy Tamagotchis for my entire design team and engineering team.
We were even smaller than today.
It was about a year, a year and a half ago.
I bought the Tamagotchis from Amazon and sent them out as a gift, and I said, hey, this is what we're making.
And then obviously, you know, we all love the Pokédex, and nobody has actually made the Pokédex yet.
So I think, you know, we want to bring that cultural relevance into it.
Plus, we know that the most efficient and reliable form of voice dictation, voice commands, and voice interaction is not a wake word.
I don't like wake words, because I was at one time in a meeting with a couple of Amazon folks,
and obviously we were talking about Alexa, and obviously they kept saying "Alexa," and the entire meeting became mayhem because they all had Alexa speakers around.
And then a whole bunch of Alexas started talking to each other, and that's very, very dumb.
So I don't want a wake word. And if you don't have a wake word, you think about what's already been in use, right?
And then I have another example.
I thought, okay, think about the most crucial scenario where you need to use voice and voice only.
Then we think about soldiers and military folks.
They're in fight-or-die situations, but they all have this very reliable, redundant design:
one button, nothing else, just one big button right next to their chest or right on their headset.
You just push and talk. You're not pushing and then waiting for an activation sound before you start talking; it doesn't guess when you're done, you just push and hold it.
And then in the civilian market, it's even more common. It's called a walkie-talkie.
We all know that and we all have that.
So I guess that's why we combined all these elements together and put them into the mix.
And that eventually set the form factor of the R1. We want to offer you a modern version of the Tamagotchi and the Pokédex, powered by AI,
but with a very reliable push-to-talk button that you probably already know how to use.
That's very insightful. Like, I didn't realize that your rationale for having the button was to, you know, not get into this problem where there's a whole room full of people using a bunch of devices and all of them calling them out by name.
So that, yeah, that makes a lot of sense.
And so look, I mean, the other thing is, obviously, you know, you've been sort of building this in public.
I mean, first of all, you've been doing it way, way more actively than pretty much anyone I've seen so far,
to the extent that everyone on X is like a supporter of your company.
So what did you learn the most, and what surprised you the most, about all this attention you've gotten and all this customer demand,
and the user love that you've gotten so far?
Yeah, yeah, that's a great question. Actually, you know, a couple of things.
First of all, this is not my first startup. You know, I've dedicated my entire career to building ideas of my own.
For this one in particular, the journey of Rabbit, which we just started, what I learned, you know, probably something I learned in the past weeks, to be honest,
is that I didn't expect how tired people are of their existing apps and existing phones. You know, I know I want to get rid of the apps,
and I know iPhones are boring, but I didn't expect to set off such great waves of resonance, where people are spontaneously designing new form factors for us.
That's what you see me reposting on Twitter with support.
People are designing watches, people are doing all kinds of different things and different colors of the R1.
So that surprised me, because I thought, you know, maybe it was going to be very hard to convince people that this is new, and that we'd need to figure out some magic for people to love it.
I think that was quite a surprise.
And another thing is that I think color played a lot of roles here.
I would say that if we had designed this with an Apple color scheme or, you know, a mainstream color scheme, say matte black or silver, I wouldn't expect the R1 to have become as iconic in the design industry as quickly as it has.
I think color played a big factor in helping us get our, you know, image and form factor out there.
But I also learned quite an important lesson myself, which is that, you know, no matter how successful you are and how well educated and experienced you are, being a startup is a reset.
You know, if you want to do a startup, it has nothing to do with whatever you did before, and it has nothing to do with your past titles or whatever that is.
It's always a hard bootstrap. It always starts from the beginning.
And I think, you know, that's why I try to be as transparent and as active as possible, even though that's sometimes quite exhausting.
But I think, you know, if you look at our team, it's an amazing team that has achieved a lot of things.
But deep in our hearts, we understand we're a new startup, this is our very first product, and nothing should be prejudged.
We have to, you know, act like a startup, keep up startup speed, and be the fastest guys out there.
And at the same time, be as transparent as we can.
Speaking of that, what are some other AI products out there that are interesting and exciting to you today?
Yeah. Well, I just like to test new things. You know, I guess you can categorize me as, like, a super early adopter out there.
I was actually in the early test group back then; I think I'm probably among the first 200 guys to ever get hold of Google Glass.
And obviously, I was among the early Oculus DK1 purchasers.
And obviously, I supported Eric at Pebble,
if you guys remember Pebble, the first e-ink smartwatch.
So I've tried a lot of new things; I love trying new things.
I'll probably set an alarm on Friday to order the Vision Pro as well.
But I think, well, when we started Rabbit, there was obviously, you know, Humane.
There's Avi's Tab.
There are, you know, the Meta glasses.
There's the new generation of Google AI, or whatever that is.
But the way we position ourselves is, we first position ourselves as users.
I ordered a Humane Pin on day one.
I'm still waiting.
And I watched the entire talk from Avi about Tab.
I learned a lot, you know; it's quite a different perspective, but everyone is working toward the same goal, and that's the thing I think we all got excited about.
So the thing is, I put myself in the user's perspective.
And if you put yourself in the user's perspective, because you are also doing the same thing, making a new device with AI,
that eliminates a lot of your personal ego, and it eliminates a lot of your, you know, prejudgment or misunderstanding.
It's really just putting myself in the user's perspective and making the relevant decisions. Because if I put myself in a user's perspective,
knowing the current level of AI, I'm not sure if I'm going to pay 800 bucks plus a monthly subscription.
I'm not sure if I'm going to just get rid of the screen completely.
And I'm not sure if I'm going to wear glasses.
I mean, I paid 6,000 bucks for laser surgery to get rid of my glasses.
I'm not sure why I'm going to put glasses on every day for work.
So, you know, we put ourselves in the user's perspective, and I interviewed each of our team members, like, what do you think about this,
if you're a user? We got a lot of insights.
I think that's the best practice anyway.
But again, for the record, I like competition.
I think competition is great.
If you have competition, that leads to innovation, that leads to honesty.
And ultimately, with competition, everyone is going to make their product better and much more affordable and accessible.
All in all, competition benefits customers and audiences, right?
So, we love competition.
You know, it pushes us to move fast.
But yeah, I think, you know, putting yourself in the user's perspective, that's what's most important.
And do you have any thoughts on, you know, an imaginary future where Siri is able to do a lot of these things, voice-to-voice, with generative AI, more natively?
Do you think this form factor is still, like, amazing then? There are a lot of people who are skeptical, like, why do we need another device,
Yeah.
if eventually Apple is going to blast out a new version of Siri to everybody?
Yeah.
I'm sure that's going to happen.
I'm sure we've accidentally accelerated these guys to push a little bit more aggressively than they currently are.
There's no way to avoid that, right? They're going to do what they're going to do.
And my honest opinion is that I've known enough people within these big companies to learn that, I think, the general public doesn't get it.
Sometimes, you know, we see news that, oh, Apple is doing this, Apple is doing that,
and Google is doing this and Google is doing that.
But if you really know a couple of folks there, they'll tell you, oh, it's just a small team of 10 people with a limited budget of two million.
You know, that's what it really is.
So for big companies, which have these massive, established ecosystems, changing anything is going to be a very, very, very hard process.
So, I think, yes, you can wait.
But it's going to be a long time. It's going to be a very long time.
I don't personally foresee how Apple, based on the current iOS experience, you know, following the current incentives of the apps, which basically encourage people to build one app for one thing,
can all of a sudden revamp everything and make something like a large action model.
And the same goes for other companies. However, I think, you know, on form factors, yes, I don't like to carry two devices.
I personally don't. I think no one should. If you can carry one device, you should.
But I think a lot of people haven't held an R1 in their hands yet.
If you hold the R1 in your hand, you'll be surprised how light it is. It's 110 grams.
Try going to your fridge and picking up two eggs. That's it.
And then there's how small it is. I have ridiculously small hands, so I think, you know, my demo is not a good reference.
I just saw another guy saying, oh, you did this wrong because you have small hands. People don't get how small it is.
It's the exact same footprint as your iPhone Pro Max, width-wise.
It's the same with people. People said that even for the Vision Pro ad, Apple picked people with, like, big heads.
Yeah, ideally you want to pick people with big heads, because the Vision Pro headset looks so massive.
Right. I don't think it really matters.
Yeah. But I think, you know, like I said, no one can say with confidence,
oh, hey, this is the best form factor. I personally think the phone is good because it's just a little piece of glass with basically an edge-to-edge screen.
That's what the current generation of phones is.
But phones are relatively easy to make, not engineering-wise, but purely resource-wise.
There are a lot of reference designs. You know, Teenage Engineering, which, as I said, helped bootstrap Nothing; they started Nothing with Carl Pei.
So we know the phone business quite well.
Phones are relatively easy to source compared with, say, cars; everything is much more established.
But I don't think we should just present a phone as our first generation. We'd rather do something small, cheap, and fun, and slowly evolve toward that.
So it's not that we've said, oh, we're never going to make a phone.
We could, but I don't think making a phone as our first generation is a good choice.
But for those companies out there, I think it's good that they've started to try something new, you know, like we see Meta starting to try glasses and all these different things.
But all in all, I think, you know, hardware is just a vehicle. It's a vehicle that hosts what's inside, and the software experience is what matters.
What's truly innovative about Rabbit is not this $199 piece of hardware.
That's the vehicle that hosts LAM and rabbit OS and good services like Perplexity; I think the software experience is what matters.
So to us, it's almost like Apple. You know, we figured out LAM, and LAM is so good that we quickly realized it cannot easily be developed as an app, or as anything else out there running on their platforms.
So we decided to do this hardware. And if you think about Apple, the iPad was actually before the iPhone.
The iPad was the first project of that generation; the iPhone actually came later. You know, Steve first wanted to build a piece of glass, just a pad,
hence the iPad, and Steve asked the team to research how to work with it, how to interact with it, and the Apple engineers presented multitouch.
And the multitouch experience was so good that Steve actually postponed the iPad project to do the iPhone straight away.
I think LAM gives us a similar feeling. You know, we know LAM is so good that we don't want to just build an app. We don't want to just build a website.
We don't want to build on a system that we don't control,
that is not specifically designed to get the best out of LAM. We would rather take a little bit of risk and do hardware.
But form-factor-wise, like I said, we de-risked that. We de-risked everything, and the R1 is the result of that de-risking.
But no one knows the correct form factor, I have to say that. The market will tell.
Speaking of that, you know, from all the market feedback you've seen so far, the 50,000 people who have ordered your device, and hopefully thousands more to come,
what are you seeing as the use cases people are excited about? And how much of that requires something like real-time, live information, the kind of information that Perplexity would actually provide?
Yeah. So for me, you know, I've been carrying an earlier prototype version of the R1, and now I have the production-ready version.
I've been carrying the R1 in my pocket for probably the past six to eight months.
One of the scenarios where I really, really don't need my phone is, first of all, search.
I don't think the current apps on your phone or websites give you anywhere close to reasonable, accurate, and precise answers.
I guess that's one of the biggest reasons I practically became a Perplexity user.
It's not just that the search is good; there's also a lot of daily searching I need to do,
and I would rather just get a pinpoint answer. So I use search, obviously, I use search a lot.
And it's actually easier for me to do that on the R1, because I'm probably on something else, you know, all the time.
I probably have, you know, Slack messages coming in, I'm probably on X checking Twitter, I'm probably on an email thread.
Then, in that blink of a second, I just press a button, ask a question, and it gives me the answer right away.
So I use search a lot. I also like the music, because, again, music today is more of a companion than a dedicated activity.
You don't just call your friend and say, hey, what are you up to this afternoon, and they go, hey Jesse, I'm listening to three hours of music.
Like, no one does that. Music is on the go, it's in the background, it's ambient.
But we do need music. So music always kind of plays a behind-the-scenes role, kind of like that.
So I'll give you a live example. You know, sometimes I play games, and I'm literally focusing on a game,
but I can just press a button and change the song, and play an entirely different collection from my music library.
That feels really cool. And that's given the fact that the music starts immediately, in 500 milliseconds.
Another big scenario is, you know, living in Los Angeles, or virtually anywhere in America, I guess with maybe a few exceptions, you drive a car a lot.
So there are a lot of hours spent on the road driving. And no matter what kind of car system you're in,
you know, the R1 connects to my Bluetooth. And again, I just press a button and I send navigation, ask questions on the go,
change or add a stop, search for a gas station or a charging spot, as well as control my home media.
I think it's a really, really satisfying experience.
One thing that's really satisfying to me is that at least now people can finally experience other kinds of search products, other search experiences,
rather than having to, you know, be dictated to by one default choice when they swipe down on their screen, because someone's paying a lot of money for that.
So I think that's great, and, you know, we should push the field forward with more cutting-edge experiences for others.
And hopefully, you know, that's also another motivating factor for people to explore this new device.
Yeah, definitely, definitely. And finally, I want to mention a new thing I'm currently super interested in, which is the capability of the camera plus search.
So, vision plus search. That's Pokédex basics, right? You point it at the Pikachu and ask what it is.
And if it returns a Wikipedia-level density of answer, I think that's such a great educational tool for the younger generation.
I mean, growing up, I wanted to learn everything about everything.
And now think about it: if we can put a real Pokédex in the hands of the future generation of kids,
and then you take them to the zoo, take them to the park,
and they just start exploring using this vision plus advanced search like Perplexity,
I think that's a very, very exciting scenario.
Yeah, I guess that's a good point to wrap this up. And, you know, I'm super excited to work with Rabbit.
And, you know, hopefully this is the beginning of something amazing for many years to come.
Yeah, sounds good. I mean, it's just getting started.
Both our teams are working together on some really, really exciting things to make it much, much more than what we've shown today.
But I think, you know, the key for us is we want to work with the best services for our device.
And search is such an important category that we simply want to choose the best.
And Perplexity is the best in the market right now, based on my own tests.
But, you know, you probably have ten more ideas in your mind,
and I probably have ten more ideas, on how we can make it even better.
That's where we're heading. Yeah, absolutely.
Cool. Thank you, everybody, for joining. And hopefully you can go and purchase your R1.
Yeah, sounds good. So we will send the email with the Perplexity code,
and if you just follow the announcements on our Twitter, we will make sure this is a very smooth experience.
And yeah, thanks again for the generous offer; everyone at the Perplexity team, we're super excited.
Thank you.
All right, bye-bye.