[Applause] Hi. So, all right, I probably still have a little bit of headset face, because I spent quite a bit of time in Venues through this day, and when there was a good crowd of people there it did feel like Connect. We've obviously got some work to do on managing the population dynamics as we get people through, keeping them from being loners in some places and more conglomerated in others, but I think it's actually a pretty good sign.

After Connect last year I was given an award for lifetime achievement in VR, and I gave a kind of grumpy acceptance speech saying that I was really not satisfied with the pace of progress in VR. Generally speaking, I'm still not: most of what I talked about last year at Connect is still unresolved and still relevant. But shipping Quest 2 in the middle of a pandemic is something that I'm really proud of. We didn't think it was a slam dunk even at the very beginning. For a long time we had thought that shipping headsets on a two-year cadence was probably about right, where you maybe alternate between PC and mobile or something like that. But with Quest shipping in the middle of the year, planning to ship a Quest 2 in a year and a half instead of two years was already pretty compressed and tight. There are all these gates you go through where you get your engineering samples and get ready for production, all these steps with opportunities to fix things, and it was going to be tight. So when we got told not to come into the office, to just stay home, I really thought that it was over, that we had blown it. It's one thing when you aim for March and slip a little bit; when you aim for the holiday season and miss it, that's a real problem. So the fact that we are ready and it is going out the door is really pretty great, and my current thinking
is that the electrical engineers, and the hardware engineers in general, must be channeling a little bit of Scotty from Star Trek, where they hold a little back so that under pressure they can pull off a miracle. That is definitely not the way software developers work, but it seems to have really paid off for us this time.

Someone commented that the rebranding here, with Oculus Connect becoming Facebook Connect, and the rebranding to Facebook Reality Labs, feels like the end of the beginning, and they might be right. Personally I'm very fond of the Oculus brand, and as far as I know Facebook is too; there are no signs of it going away. But VR is less than half of what the larger organization does, and more than half of it is doing researchy things. Michael Abrash talks about some of this stuff being on a ten-year time frame, and that's worthy of being called Labs in some ways. I personally like putting resources on products that are going to ship in the near term rather than technology research on ten-year windows, so I've always focused on the VR side of things, but Facebook Reality Labs is reasonably representative of the larger organization now.

This could be kind of insensitive, but a global lockdown and pandemic should have been the global coming of age for virtual reality: this was the opportunity to defy distance, defy reality, and all of that. But we're only sort of accidentally benefiting from it. We were sold out most of the time; we couldn't just produce the units people wanted to buy, and that is not an easy thing to rapidly change. We couldn't just say, hey, demand outstripped our expectations, let's ramp up a whole bunch more; it's a long process, and it's been unfortunate that most of the time here we've been sold out. But worse, all of our social experiences were basically killed or deprecated. You
know, we had Rooms, Spaces, co-watching, and all of those are gone. Venues has been in maintenance mode this entire time. We made this huge bet on Horizon, we've had all these people working on it, and you're finally seeing some of the fruits of that with Venues 2.0 now, but basically we weren't ready. We had all this effort going into it, and we had let the previous products more or less rot or go away. I made a pitch: can we just resurrect Rooms for this time, for the pandemic? We could spin it back up; there were people who were enjoying it, and we had gotten to the point where it was a good experience. But it was still optimized for 3DOF and wasn't set up for 6DOF. We could have run it, but nobody wanted to stop the scheduled things and everything that was already planned for this time to go work on something like that. So frankly, I'm kind of embarrassed about our social story here. Thankfully the slack has been picked up by a lot of third parties, and I frankly envy the learnings they're getting out of all this. We see the numbers, and we see lots of time spent in these apps, and there are lessons you just don't learn even with a big, well-resourced team that's been told to go build something great; it's a different world when you've got thousands of real users going through it versus just your internal testers. We're getting close to the point where we're going to learn those things, but a lot of it's going to be relearning things that other people already have.

Unfortunately, location-based VR has probably taken a terminal hit from this. It's going to be a long time before people feel comfortable going someplace and putting on a shared public headset, and that's too bad; a lot of companies have tried innovative things, and some of the experiences were really pretty magical, but I think it's just going to be a tough business case.

On the other side, exercise as a primary application of VR is really winning. At the beginning we thought that seemed a little unlikely: sweating in the headset just seemed like it was going to be a real problem, and we had a lot of worries about the fogging we saw on some of the earlier headsets. That turns out not to be as much of an issue on the standalones, which pump out a fair amount of heat and keep the lenses warmed up. It seems people are okay with making a sweaty mess in their own personal headset. I use it practically every day; VR is part of my personal exercise regimen. I trade off, where I'll do my Expert+ Beat Saber stuff one day, and the next day I'll put on arm weights and work from that as extra exercise. So some of the
fun stuff is that we are adding global, system-wide tracking of some of that movement, so we may need to stick in an extra option you can click when you have added weights or resistance. But that's been a real positive thing.

Everyone's video conferencing now, and its limitations are grating on everyone as you have to do it sometimes multiple times a day. Video conferencing has a lot of positives, but at the low-level speeds-and-feeds level, the latencies are really not that great. Even cell phone latencies are not good; many young people have just never experienced hardline wired telephone conversations and don't understand how bad cell phones are, and of course video conferencing is even worse. The last time I timed some of these things, an AT&T cellular call was 700 milliseconds of latency, and the last time I timed it in Horizon it was a little worse, like 770 milliseconds. It may have improved a little since then, and we do have a mandate to lean on this a bit, but I want people to not make small, weak goals. None of this "we want to be better than a cell phone." We should go really hardcore and say we want 50 milliseconds of latency for conversations: better than any other electronic communications medium we've got right now. That would require writing custom firmware for the noise cancellation and the way the audio codecs work on the headset, doing really top-notch networking, and having things set up properly all the way through. We have layers of abstraction, like the shared microphone service on the Oculus platform now, which adds an extra, unnecessary layer of indirection and more latency and jitter, things we have to worry about scheduling around and backing off. But it's all fixable,
and maybe we don't get all the way down to 50 milliseconds, but we can cut this in half; we could make really significant improvements, and that is a tangible benefit. In all the times I was talking with people this morning in Venues, we repeatedly ran into that same thing with multiple people in the room: "no, you talk," "no, you go." It's not instant like you're right there, where you get to do the social things that people evolved over hundreds of thousands of years to do well. We talk about presence in VR, and you do get that sense of being there with another person when you hear the audio, but lag in reality like that grates, and it is fixable; it's something we can make a difference on. It could make a real difference in a lot of these meetings, too. The way video conferencing works now, it's not as interactive as it could be; the flow of information is not what it could be. Somebody's talking, everybody else is listening, and there's not as much of the give and take as you would hope for in a good in-person meeting. So I think our technologies have ways to improve that.

I like the spatialization. In Venues it's pretty good: I was happy hearing people even up on the balconies or down below, and I was able to locate people, but we can get better. There was a good video put out recently by the FRL audio team about making audio that was often indistinguishable from reality: somebody is talking, and with your eyes closed you guess whether it's a computer projecting through the headphones or somebody actually speaking in the room, and you could get very close to coin-flip levels. I've been pushing for us to do that with our headsets. You set the headsets up, you've got the microphones and the headphones, and that's a wonderful case where we know what the hardware is on both sides; we can accurately map all the transfer functions, and you can make it really sound like they're there. An interesting side effect is that you lose the ability to have an actual volume control on people, because there is no higher or lower volume if you're doing that; there's only the right volume. If a person is four feet away over on that side, talking at this volume, there is a correct volume to be coming out of the headphones, and you just do the right thing and match reality. In a lot of ways this is like the revolution that happened in rendering, when you went from giving the artists a lot of knobs to tweak everything, to physically based rendering with energy-conserving light reflection from surfaces. We should be like that for our audio too: physically based audio, where it just is what it is. You give up some degree of control, but it feels right.

And in that same spirit of matching the world, one of the things that still makes me smile about our systems is matching reality with the virtual view. I've never liked that we still have
this big light leak around the nose. Some of the early Gear VR headsets blocked out 100% of the light and were in some ways a better experience, but all of our standalones have had this reasonably sizable gap around the nose, so you can look out and see your hands a little bit. And it just fills me with a warm glow when you can take the controller, look through one eye, hold it there, and see it flow seamlessly from reality up through into the lens, the whole thing fitting together. It's not there just naturally; early on you've always got things that look offset and broken, and people have to go in and drive through all the different parts of the stack. It's pretty complicated, with the tracking service, the application, the compositor, the distortion, all these things; you have to get everything right, and then reality just seamlessly flows into the simulation.

Sometimes I talk about how interesting it would be to take the lighting in a room and synchronize it with our screen flashes. We argue about how much it means to be at 60, 72, 90, or 120 hertz; it would be fun to say, all right, here's our headset, let's also synchronize the lighting in the room: no external ambient lighting, just high-frequency LEDs pulsed exactly like the display, and then map the exact room in virtual reality, so you could slide the headset off, kind of squinting off to the side, and have it be a completely seamless situation. So I think that matching reality is an interesting approach, and eventually, when we have people in there at the codec-avatar level, that will be a whole other level, but that's not coming real soon for our mobile systems.

Also on the video conferencing side of things: video conferencing is generally done where you want it
to work everywhere. You've got every possible client endpoint going over the web, different WebRTC native applications and so on, and that means, again, it's not pushed as hard as it possibly can be, as opposed to something like Oculus Link, where we are really pushing hard to minimize the latency: sending fractions of the screen over, decoding them, pipelining in all these different ways. We can certainly apply that type of latency work to conferencing in virtual reality, but maybe something like that also moves over to Portal in some way, applying Oculus Link-level grinding on the low-level latencies to generalized video conferencing systems.

So, the original Quest turned out to be more right than we really expected, and the biggest problem was that we didn't make enough of them. Quest 2 is better, faster, cheaper, and we're making a ton more of them. It's really rare to be able to honestly say things like that, improving on all the axes, because usually you wind up with a tripod of constraints where you get to pick two: one leg's got to give in some way. But this is very close to a pure win, and I will, by my normal methods, point out every last little thing where it's not a pure win, but on net this is great.

The biggest thing is that every Quest app should look better on Quest 2. It's a higher resolution screen and a faster processor, so things should be smoother, and we automatically adjust the resolution up, so things wind up looking better. The actual resolution is 3664 by 1920, and it's a full RGB stripe, which, compared to Quest with its PenTile OLED screen, means a little over twice the number of subpixels, and conversely a little under twice the number on Go, which always had a few more subpixels than Quest did. What this means for content: on our current systems, Go and Quest, if you looked at something like Netflix, you would have a screen that was fairly large and occupied a good fraction of the field of view; you could have a 1280 by 720 screen, and it would be great in the center and alias a little out at the edges. On systems where we were able to turn on supersampling, like in Browser or FandangoNOW, you could have a 1280 by 720 screen that looked perfect pretty much all the way out. On Quest 2 we can basically bump those numbers to a 1080p screen: you can either have a non-supersampled screen that stretches kind of IMAX-style, or, with supersampling, something that's home-theater style, or you can take something a little lower resolution, like a 720p screen, and it now works at monitor distance, like a real person's monitor, rather than the gigantic screens we've typically used. And that means you can have multiple 1080p screens in a big setup. I have triple screens set up for my desktop work; you can set up triple 1080p screens, bigger now, in VR, and this is getting to the point where you can start doing real work with it. It might have some advantages over laptops in some situations.

Now, the newest feature that we've got,
one we've never tried on any of our systems before, is this notchy interaxial displacement (IPD) adjustment. Quest had fully smooth adjustment, where you could analog-slide it to exactly where you wanted: Quest had two independent screens, the screens moved with the lenses, and the whole assembly moved. Quest 2 is a single LCD screen, the lenses move on top of it, and they only move to three separate positions: standard, narrow, and wide. In wide, because it is just a single screen, moving the lenses all the way out sacrifices some field of view; the field of view pulls in a little, and you've just got black at the edges. People using the wide setting will notice that loss of field of view on Quest 2 relative to Quest, but it's probably, again, on net the right solution: it's better than Rift S or Go, which had no adjustment at all, and it has allowed us to get this much better screen.

Now, in general, the LCD-versus-OLED trades: we've popped back and forth between these across all of our headsets, and most of you know the trade-offs. The big ones are that with OLED you get a little less latency, because the pixels change instantly, you have a little purer colors, and you have, at least in theory, pure blacks. But because we had to do mura correction to get rid of some of the speckling and inconsistencies on the displays, we wound up not quite getting that whole win, and we got black smear and sometimes a two-frame rise at the low end of the display range. So in general I've thought that, as on Go and Rift S, the LCDs have netted out a little better even at comparable resolutions, and now that we're at twice the subpixel resolution, I think this is a really clear win. It is clearly the best display that we've ever had. But the one drawback that still does matter is that the latency hurts you a little bit,
where you command the LCD to switch, but it takes a few milliseconds for the pixels to actually finish changing, and then we blast the backlight behind them. We do have a little bit of an advance here relative to the previous ones: earlier displays had a single backlight, and we would command it to flash and it would do a one-millisecond burst over the entire panel. Now we have that split into two pieces, so the left eye and the right eye get their own separate bursts, which gives us a little more cushion to pull things in. We don't have to wait until the scan-out has gone all the way across and the very last pixel has had time to transition before we blast the screen; we can scan out half the screen, wait while the second half is scanning out, blast the first eye, then blast the second eye. So this is a very limited version of the rolling shutters we had on Gear VR and Quest, where the display was continuously scanning out, and we've always had arguments about the relative merits of rolling shutters versus global shutters. Somebody did point out that with the two eyes flashing at different times, in Home, when we're scrolling content sideways at a constant rate, it felt a little like the panel was moving in and out in distance: because there is a slight delay between the left eye and the right eye seeing the same rendered content, your eyes register a little bit of disparity from the same image arriving at different times. We always had that on the full rolling-shutter systems, but it is still something that makes a little bit of a difference.

So right now, at the same frame rate, there is a little more latency on Quest 2 than on Quest 1, because we have to wait a little while for the LCD to settle before we flash the backlight. We can usually claw that back by running at a higher rate. We designed this system to run at 90 frames per second, but we ran into some last-minute problems, so it's shipping at 72 by default, with a little experimental option you can turn on. What we missed for a while was that Guardian now runs as a whole separate process, and our compositor takes input images from multiple different clients: the pop-up dialogs, the actual game screen, the UI can all come from different places. But now that we have both 72 Hz and 90 Hz, and possibly 60 Hz for some media cases, Guardian rendering at a different rate than the application makes the whole thing feel pretty juddery. So we're shipping it initially at 72.
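As a rough illustration of the left-eye/right-eye flash offset described above, here is a back-of-envelope calculation. The half-frame eye offset and the 30-degrees-per-second pan rate are assumed figures chosen for illustration, not measured Quest 2 numbers.

```python
# Back-of-envelope numbers for the split backlight flash.
# All figures are illustrative assumptions, not measured hardware values.

def frame_time_ms(hz: float) -> float:
    """Frame period in milliseconds for a given refresh rate."""
    return 1000.0 / hz

def apparent_disparity_deg(scroll_deg_per_s: float, eye_offset_ms: float) -> float:
    """Angular disparity the eyes register when identical content is flashed
    to each eye at slightly different times while panning at a constant rate."""
    return scroll_deg_per_s * (eye_offset_ms / 1000.0)

# Assume the two backlight bursts land roughly half a frame apart at 72 Hz.
offset_ms = frame_time_ms(72) / 2                    # ~6.9 ms
disparity = apparent_disparity_deg(30.0, offset_ms)  # content panning at 30 deg/s
print(f"eye offset: {offset_ms:.1f} ms, spurious disparity: {disparity:.2f} deg")
# prints: eye offset: 6.9 ms, spurious disparity: 0.21 deg
```

Even a fraction of a degree of spurious disparity is well within the range stereo vision responds to, which is consistent with the panel appearing to shift in depth during a steady pan.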
We'll sort this out and get Guardian to dynamically adjust to the right frame rates. But when you run at 90, our latency is pretty much a wash. We actually do have the possibility of running this display at 120 frames per second. This is one of those cases where the display engineers say: we designed this for 90, it's certified for 90. But somebody went in and said, hey, it kind of runs okay at 120, and just like many of our previous cases, we got into all the arguments about it not being exactly certified for that. If you have a really cold headset, say you left it out in your car overnight in a cold climate and you bring it in and put it on, it's going to have some ghosting and some problems even at the normal rates, and it would have much more at 120 until it warms up, because LCD switching speed is temperature dependent. Realistically there are not many mobile applications that could run VR well at 120 hertz, but there are some, like our shell: you could be in Browser watching at 120 hertz, and for some people there is still a slight difference between 90 and 120 hertz. I'd love to see really competitive games like Beat Saber able to run at 120 frames per second. Going by how things have gone historically it probably won't happen, but I hold out some hope. To go again with my Star Trek metaphor: Scotty, give me warp 10, the ship can take it, we can hold together on this. That would carve a little more latency off, and it would be a little more stable.

We also have another new tool that I was super excited about: the idea of dynamically firing off the retraces, instead of always waiting for exactly 60, 72, 90, 120,
or whatever. On PCs, a lot of monitors just let you run your frame when you want to, and that is great for smoothness: instead of lurchy steps when you just barely miss the frame rate you're targeting, you run the frame out when it's ready, and it's an almost pure win on a PC monitor. The difference is that those are full-persistence displays, where the backlight is essentially always on, or maybe PWMing at some incredibly high rate for dimming. In virtual reality we just have this one blast of the backlight. I had hoped it would be okay, that we'd miss 72, actually run at only 70 frames per second, and nobody would mind, but our early experiments show that varying this much dynamically leads to visible flickering. I'm not convinced that's inherent yet, though; this is one of those things where we need to get back into the laboratory, put some really high-end sensing equipment on it, and run all of this down, because there are a lot of things that could be going wrong with this.
For one thing, we have different timing for the backlight flash versus the scan-out. It doesn't look good right now, but I have some hope that we just haven't done things exactly right and that we may be able to get more out of this, which would be great. Somebody also made a clever suggestion for a fallback plan: if the killer is the variable cadence of the backlight flash, we might leave the backlight on an exact cadence but delay the scan-out until we're really ready. That gives up a bunch of our cushion, and starting the scan-out late might leave a hint of a ghost, but the frame would still come out at exactly the same time, and a hint of ghosting would still be better than an entire frame of judder. So we have some fallbacks that we may be able to pull in one way or another.

On the topic of extremely high-performance systems, I'd like to make a pitch to game developers to consider architecting their games differently where it really matters. We have competitive titles like Beat Saber and Synth Riders where people really care about the difference between 72 and 90 and 120, and it's great to see people in the PC space doing tests showing that elite competitive gamers can tell the difference, and that it makes a meaningful difference in objective tests, even between a 240-hertz and a 360-hertz monitor. So I like to think there's some headroom for doing exotic things here. The way games are structured right now, you start your game frame and ask the VR system for your predicted display time, and it tells you that whatever you do now is going to show up on the screen usually something like 48 milliseconds in the future, because it has to go through the game simulation, the rendering, the GPU drawing it, and the compositor putting it together.
It's this long line of things, so there's this 40-something milliseconds of delay. You can carve that down: if you don't have extra latency, you can pull one frame out of it, you can phase-align different things, and you can nudge it in various ways, but it's still a substantial double-digit number of milliseconds, and it winds up being extrapolated. If you're pulling your arm out and the trigger gets pulled here, it returns saying the trigger went down, but it's predicting that extra 40 or 50 milliseconds further ahead. That's just the way games are set up right now: you get your input, you do your simulation, then you do your rendering, and it goes through the whole pipeline. But it's possible to change this so that instead of simulating what you're going to do 40 milliseconds from now, you simulate at an arbitrarily high rate. Lots of things, especially simple experiences that are almost on rails where you're just pointing at targets, could be simulated at a significantly higher frame rate: 200, 300, whatever frames per second. You can go all the way up to the thousand unique IMU samples per second that we deliver, and it would be possible to do some very precise positioning with that. But you would have to structure your game so that what gets rendered is not just what was most recently simulated; you'd have to do a little more decoupling. It's something I'd like to see somebody take a stab at, and then do a rigorous, objective A/B comparison: set somebody up with conventional frame-synchronous rendering here and 500-hertz supersampled input there, and see if it actually makes a difference. It might.

Now, Link is supposed to be the convergence that replaces PC headsets: Quest 2 is supposed to basically be the future of the Rift line as well as the future of the Quest line.
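Coming back to the decoupled-simulation pitch for a moment, a minimal sketch of that structure might look like the following. Everything here is a hypothetical illustration, not an Oculus SDK API: the simulation ticks at a high fixed rate on fresh tracking samples, while the renderer just snapshots the most recent state once per display frame.

```python
# Hypothetical sketch: simulate at a high fixed rate, render snapshots of
# the latest state at display rate. Not an Oculus SDK API.

SIM_HZ = 500        # fine-grained simulation ticks per second (assumed)
RENDER_HZ = 72      # display frames per second
SIM_DT = 1.0 / SIM_HZ

class GameState:
    def __init__(self):
        self.time = 0.0   # stand-in for real gameplay state

    def copy(self):
        snap = GameState()
        snap.time = self.time
        return snap

def sim_step(state: GameState, dt: float) -> None:
    # Consume the newest controller/IMU sample and advance gameplay here;
    # hit tests run at full input precision instead of once per display frame.
    state.time += dt

def run(duration_s: float):
    state = GameState()
    snapshots = []                         # what the renderer would consume
    ticks = int(duration_s * SIM_HZ)
    ticks_per_frame = SIM_HZ / RENDER_HZ   # ~6.9 sim ticks per display frame
    next_render_tick = 0.0
    for tick in range(ticks):
        sim_step(state, SIM_DT)            # many fine-grained sim ticks...
        if tick >= next_render_tick:       # ...per rendered snapshot
            snapshots.append(state.copy())
            next_render_tick += ticks_per_frame
    return snapshots

frames = run(1.0)
print(f"{len(frames)} rendered snapshots from {SIM_HZ} sim ticks in one second")
# prints: 72 rendered snapshots from 500 sim ticks in one second
```

The renderer then works from state that is at most a couple of milliseconds old, rather than a state computed an entire display frame earlier, which is what would make the frame-synchronous-versus-high-rate A/B comparison worth running.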
And on day one, when you plug it in, it's still not going to be quite as good as a PC headset: we're still doing the compression, there's some extra latency, and while it's pretty darn good, it's not as good in every respect. But we have the potential to actually make it better. We can run at higher frame rates; 120 hertz might not be something a mobile system can do, but there are PC games where it's more reasonable. And we have a much higher resolution screen, so even with video compression we may be able to get higher quality images in some ways, and we may be able to do various image-amplification things. So I have reasonable hopes that it can get better. Last year I laid out some of the opportunities for really taking advantage of the full USB 3 bandwidth on Link, where we could have much better latency and quality than what we're doing today, but it would be a very different system; I still think there are useful things we can pull together there.

There are going to be easy wins, just turning up rates for everything on Quest 2: it's got better video codecs, we can do 8K video, and while the latencies probably aren't significantly better, we can throw a lot more bit rate at it. It was a little silly shipping a USB 3 cable initially, and making a big deal out of it, when we couldn't ship more than 150 megabits and USB 2 cables worked just fine, with only a couple more milliseconds of latency in actual wire transit time. We've got the opportunity to push more through it now, though still nowhere near the full bandwidth with conventional encoders; there are other, more dedicated things we can do when we get the time. If it turns out we're mistaken in our current approach and flickering is not inherent to variable frame rates on a low-persistence display, the G-Sync-type things could help Link too: dynamic frame rate all the time would be a positive there. And if you're going to plug a cable from the headset into a PC, there are things we should be able to do going the other way, like making recording and casting work more reliably over the wired connection; there are lots of places, like the Wi-Fi hell zones at conferences, where you just can't expect to do anything wirelessly, whether it's casting or streaming. There is still one thing we are not quite as good at: Rift S has five cameras, so there's an extra camera, and some controller poses will remain problematic and just won't be as good. But we do have other tracking improvements coming along. The cameras on Quest 2 are basically the same as on Quest, and that has some implications for tracking and for
resolution various other things as well uh we still haven't announced a full wireless connection system for link and we have these interminable arguments internally about quality bars and i keep saying that i love the fact that we have existence proofs where whenever we argue about this i can say right this very minute someone is using a wireless vr streaming system and getting value from it it is not as good as being wired it is not as good as we might hope it might not meet your personal minimum quality bar but it is clearly meeting some people's minimum quality bar and delivering value to them because they keep coming back and doing it so i continue to beat that drum where we should have some kind of an air link and then it gets even more controversial when we say well if you've got an air link that can talk to an arbitrary ip address then what about cloud vr gaming and that just turns the knob even further there where okay obviously it's even worse obviously more people are going to find this unacceptable and it will be a terrible experience for more people but still i am quite confident that for some people in some situations it's still going to be quite valuable so i think one of the big shocks for people with the quest 2 reveal or leak was that we are using the qualcomm xr2 chipset where for the first time we are using a state-of-the-art chip uh it was the right thing for us to do on go and quest to be using more trailing edge chips we did not need the extra grief and hardship of working with something that hasn't been fully debugged worked out been through multiple other vendors but our hardware team has been maturing in multiple ways all of our software is maturing and using the state-of-the-art chip on this has delivered some real benefits it is a bigger boost on the gpu than the cpu which is good because we've got this
higher resolution screen that can run at faster frame rates gpus generally deliver value by scaling wider you just get to have more shader units and you can generate more pixels and it winds up being a nice thing where it still takes more power but you can keep it at a lower clock rate and derive more value from it um the cpus unfortunately while if you look at benchmarks we can run them at fully twice the speed that we're running them at right now for the way we're shipping quest 2 unfortunately that would mean that it takes four times the power when you get performance by cranking up the clock frequency you wind up in this quadratic power and thermal regime which winds up being really painful and this is only going to be getting worse for us where the power and thermal on mobile i mean we went through this really hard on gear vr where we had all of these thermal problems and things were overheating and shutting down and it was one of our big complaints a large fraction of users would wind up using it until it overheated we got away from that on quest especially where we had
active cooling we had a fan going there that could basically cool the entire thing running at pretty much peak clock rates uh but with quest 2 we're at a point where we have a fan and it adds a lot to the cooling capability but we are still not close to being able to run everything on the chip flat out so we have to kind of carefully balance out the different things here and it's interesting where a few years ago i made the comment where i'm trying to get people used to the notion that mobile may never catch up with where we were at on the pc like at the time maybe a 1080 ti or something a high-end pc system which is many many times faster than the mobile chips and we're still getting faster but there's this hazard that a lot of people think that moore's law is basically nearing the end but people have been saying that for a long time and i mentioned this to jim keller who is one of the really senior team leaders for a lot of important teams at amd apple intel and tesla and i said you know i'm worried that mobile may never get to where we are on pc that we're gonna have to start learning to live within some of these limits and he was basically nah we've got this there's a lot more there moore's law is far from dead so i'll be really happy for that to be the case i've only got kind of middle confidence on this prediction but so far as we've gone from the snapdragon 805 on the original note 4 to where we are now with the xr2 that is a lot of performance that's come up from there that was a dual processor system that interestingly was not running at a much lower clock rate than what we're running at now and could in fact run faster than what we're clocking the base rates at here but the gpu is a whole lot faster and instead of dual core we've got eight cores obviously we keep most of them for our system software but it is still a really significant increase in power so
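the clock-versus-power tradeoff mentioned a moment ago where twice the cpu clock costs roughly four times the power can be sketched with a toy model the quadratic exponent here is an illustrative assumption taken from the talk not a measured xr2 power curve:

```python
def power_scale(clock_scale: float, exponent: float = 2.0) -> float:
    """Toy model of dynamic power versus clock frequency.

    In the regime described in the talk, pushing the clock also pushes
    voltage, so power grows roughly quadratically with clock speed.
    The exponent is an assumption for illustration only.
    """
    return clock_scale ** exponent

# doubling the clock in this regime costs about 4x the power
assert power_scale(2.0) == 4.0
# even a 20 percent clock bump costs about 44 percent more power
assert abs(power_scale(1.2) - 1.44) < 1e-9
```

this is why scaling a gpu wider at a low clock tends to win over cranking a narrow part faster inside a fixed thermal budget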
don't give up on moore's law yet things are still working pretty well but on the other hand you still do have these pc supercomputers where nvidia just announced their bfgpu the rtx 3090 which is just an astounding system where in a big pc you can be drawing 500 watts of power as you're powering this and heck you can stick two of them in and put an nvlink between them so there will always be things that you can do on these big systems that can't be done on the self-contained mobile systems so having link and having our ability to continue to take advantage of those amazing things on the pc is a really great thing again this is what i always wanted from the beginning i wanted the self-contained system that could plug into a pc and take advantage of all that power so one of the things that we introduced on go and quest that didn't work out quite as well as i had hoped was fixed foveated rendering the idea being that we know that our optics are such that they're clear in the center and they have more problems at the outside you can't see as much clarity so we should just render fewer pixels out there and qualcomm did a good job with this extension that allowed us to kind of block things up where all the rendering is divided into bins on the qualcomm chips and we could assign bins to be cut to half or quarter resolution and it seemed like a really powerful thing where you could set this up and you could render half the pixels and still cover the screen but it turned out that when you go that far it really didn't look very good and it only wound up giving 15 to 20 percent more performance and there's a number of reasons why it wound up like this and unfortunately a lot of developers just kind of turned it on cranked it up to the maximum said i want all the benefit i can get and there are a lot of applications that i think made kind of a poor choice there where it
really doesn't look that great you see things especially anything like a sign anything with text as soon as it goes like halfway out to the edge of the field of view and all of a sudden it's this ugly pixelly mess and there are a few reasons for this one is that there were some subtle things with the way the blocks were aligned they weren't set up as symmetrically as they could be so sometimes you had lumps protruding in a little bit further than they otherwise would have and even though we say all right it's half the pixels instead of being what you'd expect from that with a nice kind of even bilinear stretch up what happened was the pixels were doubled it would render the lower resolution then it would double the pixels scan them out to the texture which meant that instead of a smooth interpolation between two neighbor pixels you had no interpolation for one pixel and then interpolation for the next one and that's why everything looks kind of blockier and notchier than you'd like now we have some fixes for this where we have some new approaches that let us get the proper bilinear filtering and it doesn't waste the bandwidth instead of writing out the doubled size it just writes out the normal size and in the compositor we're able to do the interpolation there so it saves us some bandwidth as well as avoiding the pixel rendering uh and there's also some really twitchy geeky level things where instead of doing these teeny tiny little bins we're able to pack a bunch of the bins together and get them rendered a little bit more efficiently so i'm hoping that the fixed foveated rendering winds up being both higher quality and a little bit more of a performance win going forward uh but one of the things that we're doing that mitigates some of this is we're moving to an automatic performance management system where instead of having the developers set the clock rates and then the fixed foveated rendering we for a while now have been doing dynamic clock rate management where we monitor the frame rate and we clock it up as your frame rate starts to dip but what i had suggested last year that we've got implemented now is that you can ask for it to make the fixed foveated rendering also dynamic so this means that you start off at whatever your minimum clock rate is the clock rates on the gpu go all the way up and only when the gpu is maxed out do we start bringing in the fixed foveated rendering from the minimum level and i'm really happy with this this means that most applications most of the time can then avoid the foveated rendering but then when they get into some really overcommitted oversubscribed uh scene it'll come in just as much as it needs to and then start to go away when it's no longer necessary and we've got some more things like that that we could conceivably take advantage of where the fixed foveated rendering is one way to deal with rendering fewer pixels another more direct way is to scale the entire buffer down and last year i had been suggesting that with the state of foveated rendering then in many cases developers that need five or ten percent
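the escalation order just described clocks first then foveation and foveation backing off first on the way down can be sketched as a toy controller the levels and thresholds here are made-up illustrations not the actual runtime's tuning:

```python
class AutoPerfManager:
    """Toy sketch of the escalation order described in the talk: raise
    the gpu clock first, and only once it is maxed out start dialing in
    fixed foveated rendering, removing it again as soon as load drops.
    All levels and thresholds are illustrative assumptions."""

    def __init__(self, max_clock_level=4, max_ffr_level=3):
        self.max_clock_level = max_clock_level
        self.max_ffr_level = max_ffr_level
        self.clock_level = 0   # start at the minimum clock
        self.ffr_level = 0     # foveation off by default

    def step(self, gpu_busy: float):
        """gpu_busy is the fraction of the frame the gpu was busy."""
        if gpu_busy > 0.9:                     # falling behind: escalate
            if self.clock_level < self.max_clock_level:
                self.clock_level += 1          # clocks go up first
            elif self.ffr_level < self.max_ffr_level:
                self.ffr_level += 1            # only then bring in ffr
        elif gpu_busy < 0.7:                   # comfortable: de-escalate
            if self.ffr_level > 0:
                self.ffr_level -= 1            # ffr goes away first
            elif self.clock_level > 0:
                self.clock_level -= 1

mgr = AutoPerfManager()
for _ in range(6):
    mgr.step(gpu_busy=1.0)                     # heavily oversubscribed scene
assert mgr.clock_level == mgr.max_clock_level  # clocks maxed out first
assert mgr.ffr_level == 2                      # then foveation came in
```

the appeal of this shape is that a typical scene never pays the visual cost of foveation at all it only appears in the worst-case moments and retreats as soon as the gpu has headroom again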
might have been better off just scaling the entire screen down hopefully with these fixes ffr is better for a little bit but you don't want to go all the way to where it's still looking bad at that point you really would be better scaling the screen down so maybe we can start taking advantage of that and then maybe we can also start taking advantage of slowly ramping the frame rates down where again it's better to drop even to 60 frames per second and maintain one frame per refresh than to have it be a juddering mess but all of these do still make me nervous because many of the things that we had to do in the original gear vr days to make it possible was stop doing all of the performance management things that samsung was automatically doing on their phones now they were much more obsessive about battery life than maintaining one-to-one frames so it's not clear that it can't work out well but there's a hazard here the more control we take and we start making decisions uh developers always can do a better job but most of the time they have a million other things to worry about they're worried about the game play rather than whether they should be tweaking things on a frame by frame basis so i think it's going to net out to be a good thing having us take over more and more of the control there but i always do recall that there's part of the system called the mpd it's part of the power management system but somebody kind of mocked it as make poor decisions and there's always that hazard at the system level of thinking that you know better and making decisions that wind up hurting the app so we have to always be a little bit vigilant about that uh on the power budgeting side the cpu clock speeds one thing that's especially unfortunate the xr2 has not only big cores and little cores like we've had for a while one of the big cores is a prime core which has a little bit higher clocking capability and more cache and it should be kind of you burn more power but it could run faster and for a lot of game systems that could be a really great thing usually you have your game thread or your render thread that is the critical path it's the long pole and that's what always causes frames to miss and it seemed like it might be a good idea to pin that to the prime core but it turns out when we did a bunch of measurements the prime core just uses more power even at the same clock rates i mean i guess bigger cache means something there may be other architectural issues about it but it's really unfortunate where we wind up clocking the prime core down a little lower than we clock the other gold cores if we just had more thermal margin then we could run these things significantly faster we could unlock other performance capabilities and in some other ways i regret that i see us becoming almost more like samsung as the company matures and ages where when i started on gear vr i remember doing systraces and going what is all this garbage that's taking up our cpus what are all these processes running these are not part of the application that's running now and we try our hardest to keep all of our system stuff
off of the cores that are reserved for the games but some things kind of creep on and it's not free the other cores still do have parasitic bandwidth losses and we have more and more processes and services that we're spinning up for different things and it creeps into everything the whole independent processes thing is a leaky abstraction and we have to continue being wary about this but as the company grows everybody wants to have their own team have their own application have their own part of the system and they run independently and they wind up tripping over each other in a bunch of different cases i made one point i knew it wasn't going to go anywhere but i had made this point that all of these services that we have going on could just be one monolithic service we could throw it all in combined vr shell vr runtime guardian horizon all of these different things we could put it into one process all the teams would hate it because they'd be stepping on each other's toes but it would save us memory and resources we wouldn't have the frame rate correspondence issues we're having with guardian but this is trying to push back the tide it's not going to happen i'll fight the good fight as much as i can there but this is going to be a kind of a continuous problem now we got a lot of benefit in quest especially from using the dsp on the 835 and we have several new toys on the xr2 to deal with we have some little computer vision accelerators and we have the tensor accelerators for some of the neural network work and we're going to get good value out of this and it's a good thing that we don't have too many people really twisting our arms for external access to this because we wouldn't have the bandwidth to deal with it i know if i was writing some from scratch custom engine for vr i would want to get my hands on these but they're just not coming to user space for you anytime soon
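the core-reservation idea above keeping system services off the cores handed to the game is the kind of partitioning you can sketch with cpu affinity calls here is a minimal linux-only illustration where the choice of core 0 for system work is a hypothetical example and nothing here reflects the actual quest core reservation:

```python
import os

# hypothetical split: system services pinned to core 0, leaving the
# remaining cores free for the latency-critical app threads
SYSTEM_CORES = {0}
APP_CORES = set(range(os.cpu_count())) - SYSTEM_CORES or SYSTEM_CORES

def pin_to_system_cores() -> set:
    """Pin the calling process to the reserved system cores and return
    the affinity mask the kernel reports back (linux only)."""
    os.sched_setaffinity(0, SYSTEM_CORES)
    return os.sched_getaffinity(0)
```

even with a clean split like this the point in the talk stands the other cores still share memory bandwidth and caches so reserved cores are necessary but not sufficient isolation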
unfortunately but it does let us do more inside the power budget that we reserve for ourselves and there's the idea of custom silicon like what can you do there for the most part the gpus are doing what vr mostly needs and some of these things with the computer vision and tensor accelerators are getting us some good value but there is always this case of you can always specialize something a little bit more and take more advantage of it but the lead times on this are really long qualcomm was asking us hey what should we put in these things really early on i mean way back in gear vr days when they were planning xr2 they were coming and saying what kind of custom stuff would you like to see in there and i was kind of like heck that's years in the future i don't know what our computer vision stuff's going to look like most of the neural network acceleration stuff wasn't even really on my radar at that time so they had to have a lot of foresight to be looking ahead to get those things in and land and i'm happy to be taking advantage of them right now so on the temperature limiting side i am having some of these little battles internally that are flashbacks to samsung where with gear vr we had all these issues about temperature shutdown and with mobile chips there's really two different thermal limits there's the thermal limit where the very low level stuff shuts the chip down because it says okay something is about to burn in a non-recoverable way and it just shuts it off but far far before that is the system designer's decision about this is as hot as we want to allow the chip to run and you make that decision based on well what's going to be the case temperature in different places do you care about the average or the very worst case what's the hottest spot on the case and there can be legitimate differences of opinion about where the
right place to draw that line is where there can be a temperature where you can say all right if you held somebody down and held this against their skin and they couldn't move for 30 seconds this is going to leave a mark on them but i would say in many cases that if you've got something that's really hot you just move your hand and there might be an opportunity to let some areas of it get a little bit hotter than others and there's lots of options on cooling where you have the choice between a completely passive solution like go which was very nicely engineered an aluminum front plate with heat spreaders behind it and it was possible to overheat go but for the applications that it had it worked pretty well with quest 2 we've got a fan inside there and it can spin up and move quite a bit of air when it needs to there was one build where the fan was broken and it was on all the time and i thought something was wrong with the audio circuit because it sounded like there was a whole lot of static coming through the audio system it was just making this buzzing all the time and there's the trade-off with fans where you can be
small and fast or big and slow and heavy goes with the slow it's kind of the helicopter versus jet engine way of moving air but there are limits to what we can do there although we could thermal engineer a lot harder than we do right now but in the end my kind of turbo ferrari style of engineering is probably not the right endpoint for consumer head mounted displays we don't need melted pistons and broken input shafts here but i do try to tug a little bit here where there is a tendency to perhaps over conservatism in some ways and we can take a few more steps towards a little bit better performance now in some non-obvious ways with quest 2 we really are optics limited now where instead of being able to look at the screen and say well clearly i can see screen door effect i can see the individual pixels now this is for applications that do everything right there's a right way to make peak quality content if you wind up with a supersampled layer with properly sized content that's srgb and all the right things set up at that point over all but a very small part of the screen we are rendering pixels that wind up not being directly perceivable the obvious thing is the quality of the optics where we've always had the fresnel rings around the outside that cast our god rays and wind up kind of chopping up things at the edges the optics designers have to make tough calls as they trade off fidelity in the center and we've got a flat screen and curved optics and it's just hard to make the proper focus work all the way across the edges and trade-offs are made but there's some other things that are not as obvious where chromatic aberration correction we do correction for this where with chromatic aberration you put white on the screen and you wind up with red green and blue being stretched apart from each other and we do a
good job now where you can still see a little bit of a fringe from it but if you look at the uncorrected screen like if you popped the lenses off of the head mounted display a little letter a near the edge of the screen the chromatic spread is so great that it's not one letter blurred with offset color bands you have a red a a green a and a blue a that are actually completely separate from each other they're spread that far apart and because the filters on the lcd are not perfectly chromatically pure the corrections that we do for this do mean that while we're able to mostly smoosh everything back together there's still a little bit of a smear between all of those regions that winds up acting as a little bit of defocus for things now this is also why subpixel rendering just doesn't work in vr like the font rendering techniques that people do on desktops to get kind of cleartype subpixel red green and blue independent rendering it's pointless to do in vr because we do not resolve chromatic aberration down to the subpixel level you can go ahead and say well i'm going to render something different for red and blue but it's going to be moved around based on how you've got the headset on your head versus how wide your eyes are what you've got set up this is something that you can try in your own headset where if you look at a screen you look over in kind of one corner and look at how much the fringe is there and then just adjust the headset a little bit and that fringe will move a macroscopic amount like potentially a pixel or two depending on where your eyes were and of course if you don't have perfect ipd for whatever it's set up for that becomes more of a problem now this is possibly something that eye tracking could help with uh in general i'm not as bullish on eye tracking for foveated rendering as most people are it's exciting you've seen the demos with the
raytrace stuff where you look over here and you can throw away 95 percent of your pixels but for a bunch of reasons this is probably not going to work out as well for our conventional headsets with conventional rendering and the way we can do foveation but interestingly this might be a possible way where even if you don't have super fast reactions if we're able to tell where your eye is relative to the lenses we still can't correct focus but we could theoretically have unique chromatic aberration corrections for each eye position and that could let us claw back some of this to the point that maybe we could do subpixel resolution and in that small sweet spot where the optics still are good enough that lets us claw a little bit more resolution out of it now there's another interesting kind of techie bit with quest 2 where i was so proud of the hack for using the display processing unit the dpu to do chromatic aberration correction on go and then quest it was this wonderful way that this part of the hardware that we really weren't using for anything could do this part that took up a significant chunk of the gpu and we got a little bit blindsided by
one of the internal details on the xr2 where for almost everything it's just a superset and better but there's one tiny little thing in the display processor where somebody must have gone nobody uses all of these channels and some of the channels got a little bit downgraded so our old scheme of doing multiple windows and stretching the red and the blue separately from the green no longer worked directly but one of the engineers made a fairly heroic fix late in the game they used yet another feature that we weren't using display writeback to memory that allowed us to composite things together and still get our display chromatic aberration correction it's wasting even more main memory bandwidth to the point that i'm really wincing about it but it turns out these chips really have more main memory bandwidth than most of what the gpu and cpu use so it's still kind of working out okay for us so if we are optics limited where do we go from here there's different optical things that future headsets could have where you could have more complex optics you could have doublets many of the kind of ancient age head-mounted displays when they still had terrible displays before rift some of them had crazy insane optic paths some of them had literally like a dozen lenses they were designed by like microscope designers or something that would have all of this stuff that would give probably a perfectly square image they didn't even do distortion correction so they would go through all of this it's possible to have flat focal fields and very minimal chromatic aberration you can get a lot of this if you're willing to do exotic optic trains like that but in those cases they were glass they were heavy iron and one of the things i would worry about consumer-wise is drop testing where if we were doing this in the normal auditorium i would ask for everybody to raise their hands it's like how many
people have knocked one of their headsets off a table and had it clatter to the ground and wondered did i just knock something out of alignment now in our current situation the most hazardous thing is the cameras the lenses with a single lens are in pretty good shape but on quest cameras getting knocked out of line was one of our real concerns with drop tests and we've got various things with dynamic calibration going on but if we had an exotic multi-lens optic system that might be a problem there are other exotic optics with pancake lenses and multi-bounce polarized systems that can offer some more robustness and possibly higher quality in exchange for possibly having some other issues with ghosting um if we don't wind up getting better optics and we still wind up with something similar to our current fresnel systems then it might turn out that displays are maybe better served by going to high dynamic range doing some of the things with localized dimming and rescaling everything and having brighter areas because even if we didn't get more fidelity that could still make better experiences and have some good benefits there another one of the things that is not obvious that winds up limiting us on our current resolution and fidelity is that the tracking cameras are still basically the same low resolution that we had on quest and it's surprising that these low resolution cameras track as reliably as they do they are very subpixel accurate pretty deeply subpixel but it still does wind up that if you have something that you move up so that it's a couple feet away from you in vr and it's presented at the peak quality like the best time warp layers and everything supersampled and you're looking at it and you're close to it you will notice as you're reading carefully that everything is slightly jittering around and that's because it's at the limits of our tracking precision uh now
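to get a feel for why close-up content is where this jitter shows up here's a back-of-the-envelope sketch every number in it camera resolution field of view subpixel factor and viewing distance is a made-up illustration not a quest spec:

```python
import math

# hypothetical tracking camera: 640 px across a 100 degree field of view
PIXELS_ACROSS = 640
FOV_DEGREES = 100.0

# assume the tracker resolves features to a tenth of a pixel
SUBPIXEL_FRACTION = 0.1

def positional_jitter_m(distance_m: float) -> float:
    """Apparent positional jitter at a given distance, in meters,
    implied by the camera's angular tracking noise."""
    deg_per_pixel = FOV_DEGREES / PIXELS_ACROSS
    noise_rad = math.radians(deg_per_pixel * SUBPIXEL_FRACTION)
    return distance_m * math.tan(noise_rad)

# a panel held roughly half a meter away jitters by a fraction of a
# millimeter, which supersampled text is sharp enough to make visible
print(f"{positional_jitter_m(0.6) * 1000:.3f} mm")
```

note that the jitter grows linearly with distance from the head but so does the angular size of a pixel of content which is why distant billboards look rock solid while near-field text betrays the limit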
various things work around this where if it's a giant screen on a billboard far away then it's not a problem because the rotational part our attitude changes are super precise the imus are very very good it's the translational tracking that we base off of the optical cameras that has a limit to it um so that implies design things where people want to have things that they interact with at arm's distance but that has problems both for the focal stuff without varifocal and also it's at the limit of tracking and you can have problems with that but of course most people that are just rendering things into the world are still not at the absolute limit of quality and it's more at the level where it kind of works out now but we are absolutely at the edge of what we want to do with that and whatever the next headset is we've got to get higher resolution on the tracking cameras so ergonomics wise we are a little bit lighter a little bit smaller it's still not quite where go was but better than quest was and the great thing is we have these accessory head straps we have the rigid strap and then we've got the battery counterweighted one which a lot of people have wanted for a long time where it's more weight on your head but the counterweight winds up unloading it
off of your face more so that should hopefully be for most people the most comfortable headset we've ever had but there's still so much room for us to go there i still want to see ultra lightweight headsets you still don't want to be in vr for hours at a time right now and as we start going to productivity applications you really need to sort that out and also as you go to ultralight there may be these synergistic benefits where you make something that fits more like glasses where you put on nose pieces so that you can locate the lenses much more precisely to the eyes and it can help with our kind of optics limited resolution there there are very strong opinions about what can work whether wired versus wireless i still tend to think that there might be useful things with wired cases i keep pointing out billions of people have used wired headphones and gotten value from them now obviously a thin little headphone wire does not carry all of the data that we might need from a completely separate compute puck of some kind but we did look at this for the latter days of gear vr doing this kind of two-part plug-in instead of drop-in and there may still be some useful things to go there on converging with go there are still some things for which go is the best headset where if you just have something where you want to look at an immersive video you want to set something up as a display and just put it on and you are there magically everything works right now you put on a quest and you usually have to acknowledge guardian we still fail too often where you have to wind up resetting a guardian you may have to acknowledge things inside your area we're moving towards selectable users all these steps that you go through before you just have everything appear in front of you and this is something where it's still kind of a glitchy mess where
you put on the headset it usually blinks in some scene of where you were and then guardian comes up in some way you acknowledge something maybe it flashes up a bit of shell before showing a separate application and these are hard to track down when you might have literally four or five separate teams at the company that have to coordinate and make this all happen so again i kind of beat the drum of we could integrate more of this and make sorting these things out making them perfect a lot easier so the pitch for that is eventually putting on the headset should be as seamless as answering a phone call because eventually you might sort of be answering phone calls in vr if we get to where we want to be with communication you want to be using vr to communicate with people you want to be able to be paged put on the headset and just immediately be there every second counts does it take 10 seconds or 15 seconds to get back to where you were this all really matters for the experience but having things converged now on our vr platforms is an enormous relief it's really hard to overstate how much drama internally this has been over the years where my vision for vr was always as this universal device we should be able to play games we should be able to browse the web we should be able to do productivity things we should be able to connect to a pc to cloud services all this it's virtual we can do anything it should be universal but most of the other founders were really about we want this high-end awesome gaming system and this caused enormous tension through the years and it's kind of ironic how we wound up with a system where we have this low-powered gaming focused device which wasn't really what anybody was aiming for at the beginning but it's doing well for us and we're clawing our way back towards universal platform in various ways and you
know, it's great to have a team that's really pulling in the same direction now. We've got people who have shipped a few headsets; we know what we're doing. I'm always pushing to go faster, and I'm never satisfied with all this; there's so much more we can do. But at least the derivative is in the right direction now; we are making progress.

Now, there is a place for high-end headsets. Some people are disappointed that we're shipping this amazing, super cheap $299 headset, because they want a thousand-dollar headset with every feature and the kitchen sink thrown in. I think there is a place for that, but I always caution that there's a hazard even when money is no object: these things have costs. Money does not fix our thermal problems; it does not make cameras weigh nothing. You can't just throw all of these things in, even if you're willing to pay whatever. In many ways, Quest 2 can be the best headset money can buy right now. There are some things we could do better, but not as many as you might think. There aren't many screens that would be better than what we've got, and there aren't that many sensors

that would be better, in some ways. Still, along the same line, I think there's the possibility of having a low end and a high end, as long as they're the same line and the same software, so it's not something that's really competing with this. I would love to see super expensive stuff; that's the type of thing I would buy. But I think we're doing the right thing by concentrating on broadening the market. It's great that we have $1,500 BFGPUs available on the PC, but that's only possible because there are $99 video cards that enable the ecosystem. Getting the more inexpensive systems out in VR is critical, and eventually we can have our super high-end boutique things, and that'll be great and wonderful.

Now, the controllers are one of the things that is a real anchor on the cost. You buy a high-end gaming controller, like a high-end Xbox pro controller, and it can be $80 or $90 or more. Our controllers aren't quite that expensive, but they are a significant chunk of the bill of materials on something like Quest 2. So of course we're thinking about how we could make things that fill exactly the niche Go did, for media and location-based things, and asking: can we make something like this that works seamlessly without the controllers? You can see the steps we've been taking towards this with hand tracking and voice control. It's not there yet, but we can see a path to possibly doing that, if we get all of these other applications to be really valuable and functional without the controllers. When you buy a laptop, you don't get an Xbox controller shipped with it, because there are lots of things people might want to do that don't need that type of controller. I think that's a really valid direction. You know,
there are also the exotic things, like possible brain-computer interfaces. On the non-invasive side, even if we could read just one bit from the brain, I think that could be really magical. If you could be looking around and essentially have a brain-click option, possibly combined with eye tracking, that could be a magical thing: you're just looking at things, and things happen as you wish them. The latency on the non-invasive brain stuff is not what we'd like for precision control, but it might be possible to get there, and that would be pretty neat.

Our current controllers use what we call our Constellation tracking system, with little LEDs on the controller, and that winds up driving a lot of our system decisions. We use the same cameras for our global headset tracking as for the controller tracking, so we alternate frames with different exposure settings, and then we have to have yet another exposure setting for hand tracking, so we're juggling between all these different modes. Unfortunately, the controllers are carefully calibrated, so it's hard to make third-party controllers in different ways for this system.

There are lots of other possibilities. Valve does Lighthouse tracking, an external system sending out signals that the controllers receive, which has some advantages, like being able to track behind your back. There's the possibility of making cheaper controllers where, instead of the active LEDs, which cost more than you might expect on the controller bill of materials, you have completely passive objects with distinctive shapes. Instead of tracking exact dots, it's more like hand tracking, except that tracking a known shape is going to be far easier than tracking hands. Tracking hands is hard: all the different
ways that hands can get into positions and poses make it a tough problem. If instead you can say, apply all of these resources, and here is the exact CAD model of the shape we're going to track, that could be much, much better, and you could still have an IMU in it, at the cost of a Go controller. So there are possibilities there for much less expensive controllers.

Then there are possibilities for more expensive controllers that track themselves: you could basically put cameras and tracking inside the controllers, and then they don't care where they are relative to the headset, which brings a lot of flexibility. Maybe you have a controller-free SKU, and the controllers are more expensive, but they never lose tracking from being behind your back too long. Different possibilities like that.

Now, the grip fit on our controllers is something where I do think there's room for improvement. They feel really good in your hands, like console game controllers; tons of ergonomic work goes into this, and you're comfortable holding them for a long time. But the difference is that you do not sling your game controller around the way you do when you're playing Beat Saber for half an hour, and eventually you wind up making the claw and hooking your finger over the top of the controller, and that's just not ergonomic. I've tried putting friction tape around the controllers, and it's a huge mess when you have to change batteries, but it seems like there probably are better designs. I'm saying we should look towards sports equipment and

tactical combat equipment, things where performance really matters and you do not have the grip of a gun shaped like a bar of soap that's going to squirt out of your hands when you squeeze it while sweaty. Interestingly, though, I was told that there are non-trivial industrial design and cost issues in how you set up molds to make better grips like that, and that may impact us, but I think we will have some improvements going forward.

I wish it was easier to make real third-party controllers for this. I'd pay significant money for some custom Renaissance-festival fantasy design: carve them out of brass with sharkskin grips, weighted for me, something really cool for my heavyweight Beat Saber days; open up a velvet case and pull out two awesome VR controllers. That'd be cool, but it's just not really possible right now. You have to build these holders around the existing controllers, which limits what you can do. There are still interesting things being done, but one day we'll get this to the point where that kind of work is a lot more feasible.

Now, the haptic situation: I wasn't a huge believer in haptics before, but it's been interesting. We had a build where haptics were broken, and playing Beat Saber I thought, wow, this really does make a difference; I miss it. And we just have the most trivial kind of buzzy motor right now. I think it would be interesting to have something a lot stronger, where for the punching games you could basically pull back a spring and then let it go for a real hard smack into your palm. That could be good for conveying intersection with physical systems in a way you don't get with just the gentle buzzing. And maybe
we should put haptics into the headset. Some people theorize that a little buzzing on the head can actually help with simulator sickness, but it could also be another feedback channel: you just got popped in the face in the boxing game, or you stuck your head into a block or into a wall in any other game. So there might be some useful stuff there.

Then there are tracked keyboards and mice. We've got a direct partnership to do a specifically branded one, but it'll work with a lot of things, and that'll get better as we gain more experience with all of this. There are a lot of trends pushing together here: hands, body, keyboard, environment, intrusion detection. It's all about the headset learning to understand the world around it. We have lots of teams working on this, and it's some of that big-deal, long-term technology: machine perception, understanding the world, and figuring out how we can use it inside VR. There's a lot more yet to come there.

I do worry sometimes that we're in the position of pushing technology into products rather than the products pulling in the things they need. There's a real hazard in the fact that we are power limited; we care about every watt going to different places. We could wind up with some of this deep machine perception stuff that really does not carry its weight in experience value for the user. That can be a tough fight: when somebody has had a big team researching something for a long time, and now it's time to go into production, and they're told, well, maybe that's really not justified, and we'd rather spend those watts on our game applications. That's one of the internal things we have to deal with.

I won't spend much time on media this year. I know I've spent like half the time talking about it in many previous
years, but it hasn't been our push on Quest. It pulls back in a little more with Quest 2, as we're trying to subsume the Go market for different things. We know exactly what to do; we just haven't done it all right yet. There's not a single application that has absolutely nailed everything: color space, resolution, encoding, tempo, all of those things done perfectly. But we know how to do it.

During Quest 2's development I got some of the final footage from NextVR, before they got swallowed up into the belly of Apple. Their very last camera rig shot some absolutely breathtakingly good-looking footage, and on the Quest 2 screen, with things encoded at 8K and the encodes sliced up in whatever way gets us the best frame rate and resolution for pushing pixels around, it really does look amazing. But then you go back to what we actually stream. We have limits on what we can stream, but I keep coming back to this: in most cases, for many people, we could be streaming at ten times what we do right now. So many of our streams are 10 megabits, while lots of people are on connections where, with a QUIC connection or multiple TCP connections, you can pull 100 megabits down and do some of these amazing things. But almost all of our stuff is still down at 10 or 15 megabits.
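To make the gap concrete, here is a back-of-the-envelope sketch (illustrative numbers only, not from any actual streaming pipeline) of what those bitrates mean in raw data terms:

```python
# Back-of-the-envelope: what different stream bitrates cost in data.
# A 100 Mbps last-mile connection can sustain a 60-70 Mbps stream with
# headroom left over for buffering ahead, so the practical ceiling is
# the catalog's encodes, not the viewer's link.

def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate in megabits/second to gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> seconds/hour -> bytes -> GB

for mbps in (10, 15, 60, 70):
    print(f"{mbps:3d} Mbps stream -> {gb_per_hour(mbps):5.1f} GB/hour")
```

So a 60 Mbps stream is about 27 GB per hour: a lot of data, but well within what a 100 Mbps connection can deliver.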

So one of the big things I'm looking forward to is the great re-encoding, where we take all of the content we've got and redo it with exactly the right codec parameters, the right resolutions, and the right bit rates, and make it all available. Yes, we still have crummy 5-megabit versions, although immersive video is really not even worth watching at those low bit rates; we should just stop and say, you need a better connection. But we should stream all the way up: give us 60 or 70 megabits if it's available. It really doesn't hurt us to have that there; we can scan ahead and make it possible, and it's just better than what people think from what they see right now. That's been really frustrating.

Another thing: our browsing experience is still terrible. You could be forgiven for thinking we've only got a hundred videos, when you go into Oculus TV and scroll through the pages, maybe clicking See All in a few places. There was an internal post where somebody said, we have this many hundred videos above our quality bar, and I said, no, this can't be right, I know there are far more than that. Eventually they were found; there was a mistake in the query. It turns out we have 7,000 videos above that quality bar. We need to make them discoverable, give them all the best presentation, and get them out where people can see them.

Quill Theater was a surprise hit. Quill was designed on the Rift, and it's an extremely expensive rendering system: it's all about thousands and thousands of strokes from the artist. The speed artists get comes from that sketchy way of working, and it's hard to make it run at full performance on Quest. But the theater team did an interesting thing: they added an HD button. By default it renders in a cut-down way where it's generally kind
of holding frame rate, but if you want a closer look at something, you press the HD button and it will usually start chunking down and missing frame rate, but give you a crisper view. It turned out that this sense of a magical fantasy world sketched up by artists, with good audio, simple animation, and the ability to peer around at it like a dollhouse, had some real magic to it, and it seems to resonate with people more than even a lot of highly produced immersive video content. That was a great data point.

On the conventional media side, Fandango is now kind of our best-quality presentation. We finally have essentially all the movies available, and it's the thing I point to: it's a 720p screen, but it's presented with supersampling in the right color space. It's doing basically everything right, and the screen looks really darn good. Of course, on Quest 2 it looks even better, because there's no supersampling falloff out at the edges; every pixel matters, and it's all within the sweet spot. I want us to go to streaming 1080p, especially for 3D movies, where we have to cut the frame in half and you really only get half the screen. At that point you're at 960 by 1080, which is even less than what you can show on Quest with supersampling, and on Quest 2 we could use even more. But it's getting to be a legitimately good movie-viewing experience: a little more comfortable, much higher quality, everything is there. We're supposed to be getting essentially all of the 3D movies; we haven't gotten them all rolled in yet, but I'm really looking forward to that. It was the promise from day one of VR: of course you want to watch a 3D movie. Unfortunately, the sign-up flow and everything is still pretty horrible. You have to go
through and make a new account there, registering in the web browser before you can get back in, but once you do, it's a good, high-quality experience.

Boy, I am almost out of time, and gee, I've got a lot more to go through. We've got our push to be a general-purpose computing system, and all the things we're doing in the shell environment for it, like adding free resize. Right now we throw the windows up in front of you, but as we move towards a multitasking system, with a multi-tab browser and multiple windows popping up, you can start using it for legitimate work. I sat down trying to do some of my AI research work all in headset, and I critically need a PDF viewer; that's something that needs to show up in our browser before it can take over that work for me. But you can set up some pretty good working areas. The higher resolution matters, and then there's the freedom to nudge things around: some people want things down low versus up high, or stretched out differently. We still need the ability to go into portrait mode instead of landscape mode, but we're getting some real value there. It also works cooperatively with our Android applications, like Fandango and the others, and that's still one of the things that absolutely kills me, where I think we need

more Android applications. We do not have a sorted-out strategy here. I've got a long spiel about this that I'm not going to have time to get to, but we have all these existence proofs: Microsoft tried really, really hard to move all apps to a brand new UI system, and it just doesn't work out. I don't think it's going to work out for us either; I think we need to support Android apps in a broader sense. We have progressive web apps as the backstop for everything, but on mobile platforms, progressive web apps generally lose out to native applications, and we care more about performance in VR than mobile systems do. So I think we need a solution there, and we haven't sorted it out. We've been in this situation where we have React VR running for our internal applications, but we haven't been willing to really productize it and put it out; we have ways of doing direct low-level access to the panels, apps like browsers written in a different native way, Android apps we're projecting for some of our settings, and different approaches for the UI.

I'm hitting our time limit here, so I know this is not ideal, but after this I'm going to go into the Horizon beta. I know everybody doesn't have it available, and we don't have large seating capacity there, but there's going to be a Connect with Carmack world, and I think they'll be able to let in 20 people at a time. I'm going to go in there, and I'm basically going to go until my headset batteries die. Whoever can get in, make your way up; I'm happy to answer questions. It will be very much like the hallways after a talk at Connect, and I'm hoping that next year we can have this sorted out to the point where we can have the Venues presentation, have everybody there, have people making their way to the front, and migrate through all of this as a seamless experience. So we've got a
north star that we should be shooting towards, and I think I'm about done here.