Sponsor Session – Pixotope

4:25pm in Studio A

Pixotope: Flexible Virtual Production

This session covers Pixotope’s flexible approach to virtual production, with the ability to allocate engines for XR, AR, or Virtual Studio and easily move licenses between systems depending on the needs of the production. Additional flexibility comes from tight integration with Pixotope camera tracking to make XR setup easier, and from the ability to use multiple methods of camera tracking with the same base software, including markerless TTL (through-the-lens) tracking for quick turnarounds. The lack of proprietary hardware, cloud readiness, and a new mobile phone application for education make it easy to produce virtual content almost anywhere.


Laura Frank  00:12

I’d like to introduce Andy and Tanner from Pixotope, everyone. We’ll have a visit with PRG after we hear from Pixotope. Thanks, guys.

Andy Smith  00:28

I want to start off by saying a very special thanks to Framework as well as XR Studios. Can everybody just give them a big hand for hosting this event? This has been wonderful so far. My name is Andy Smith. I’m a regional sales manager for western North America with Pixotope. This is Tanner Woodward. He’s our sales engineer, amongst many other things.

I want to go through, real quick, a little bit about us. Pixotope was founded in 2013 as a virtual production company. We did a lot of creative work, then we switched over to developing our own software out of a show that we did called Lost in Time, based out of Oslo, Norway. In 2019 we successfully launched Pixotope 1.0. From there we grew from what was known as The Future Group and rebranded ourselves as Pixotope Technologies in 2021, which then brought us, in 2022, to acquiring a company called TrackMen, based out of Cologne, Germany. That’s a little bit of background about us. And now, in 2023, we’ve got cloud, mobile, as well as a few new markets that we’re branching into.

A little bit about our why, why we exist. All video-based content will eventually have the visual impact of high-end feature films, combined with the connected and scalable nature of online content and the immersive and social experience of video games. Virtual production will become the mainstay technology that enables these new types of experiences and will become indistinguishable from media production. Our why is to make virtual production more accessible. We see a community being built, and we want to help build that community even deeper and broader. Media creators will need reliable and sustainable platforms to build their business on, with software replacing both appliances and professional services as the platform of choice.

Here are a few of our customers. As you can see, we’ve got a few listed here: CGI studios right here in Hollywood; Silver Spoon Animation as well as The Famous Group as some of the creative houses that use Pixotope; and most recently, of course, Riot Games, Butcher Bird Studios based out of Glendale right here in this area, and ESPN, to name a few. And now I’m going to hand this over to Tanner so he can dive a little bit into our solutions and expand a little bit more on our why.

Tanner Woodward  03:11

Thanks, Andy. Yeah, like Andy mentioned, my name is Tanner Woodward. I’m part of our sales engineering as well as our solutions and technical support team. Going through our solutions: we offer a number of diverse solutions that power virtual production and allow our users to be as creative and as flexible as possible on site. A lot of this work comes from the wide, diverse partner set that Andy just mentioned, The Famous Group, Silver Spoon, PRG, a lot of the customers that we value highly. We couldn’t be here if it were not for some of the amazing work that they’ve done. As you can see, here are some of our augmented reality samples, our virtual studio samples, and our XR samples, which is the latest addition to the Pixotope platform. We were always doing XR before, but now it’s fully integrated into our software selection.

So here’s our total portfolio that we’ll go through; just move this a little bit closer, there you go. Pixotope as a virtual production platform consists of two hierarchies: graphics, and now tracking. In the graphics pipeline you can see Pixotope VS and AR, as well as Pixotope XR and Pixotope CG, which we’ll be introducing in late 2023 or perhaps early 2024. These make up our graphics pipeline, as well as what you can do as users to expand your virtual production workflow.

The tracking side consists of four main components that you can use interchangeably, as well as independently, as you need. The first is Pixotope Fly, which is strictly through-the-lens, and which I’ll expand upon. Then Pixotope Marker, which uses retro-reflective markers to determine its position in the studio or environment you have the tracking within. Pixotope Vision, which is personally one of my favorites, is a unique approach that uses both markers and real-world feature points simultaneously to locate the camera no matter what environment you’re in. So whether you’re in studio, or outside, or if you need a combination of both, you can move the camera from a marker position over to where there are more real feature points that it can triangulate and locate its position from. And then we have our XR tracking, which works with GhostFrame through a partnership with Megapixel and ROE: it utilizes the ROE panels with a Helios processing system to put in-panel markers that we can then track off of within an LED volume space.

Going through the graphics first: the Pixotope VS and AR Edition provides a full toolset for virtual set and augmented reality rendering. The software licenses are available in subscription or permanent form, as well as in event form. What’s unique about our approach to virtual production is that we are entirely cloud-based, from our graphics platform to our tracking platform. All of the licenses can be issued on the fly to any machine, to any user; you have an admin that runs all of the licenses that you need. So if you keep machines back at HQ and you need to send them out to certain jobs, or around the world, wherever they need to be, you can assign licenses on the fly as needed, per show.
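To make the floating-license idea concrete, here is a minimal sketch of how an admin-managed license pool might behave. All of the names here are hypothetical illustrations, not Pixotope’s actual licensing API:

```python
# Hypothetical sketch of a floating (cloud-issued) license pool.
# Illustrative only; these names are not Pixotope's actual API.

class LicensePool:
    def __init__(self, product: str, seats: int):
        self.product = product
        self.free = seats                        # seats not yet assigned
        self.assignments: dict[str, int] = {}    # machine_id -> seats held

    def assign(self, machine_id: str, seats: int = 1) -> bool:
        """Issue seats to a render machine on the fly, e.g. per show."""
        if seats > self.free:
            return False                         # pool exhausted
        self.free -= seats
        self.assignments[machine_id] = self.assignments.get(machine_id, 0) + seats
        return True

    def release(self, machine_id: str) -> None:
        """Return a machine's seats to the pool, e.g. when a show wraps."""
        self.free += self.assignments.pop(machine_id, 0)

# Example: move a license from a failed engine to a spare mid-show,
# with no physical dongle changing hands.
pool = LicensePool("XR Edition", seats=4)
pool.assign("engine-01")
pool.release("engine-01")                        # engine-01 goes down
pool.assign("engine-05")                         # spare picks up the seat
```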
This helps you save on shipping and sidestep any in-transit dongle mishaps that might happen with physical dongles. It allows you, like I said, to be flexible on site in production, to bring in new engines if something fails, or to basically just keep production rolling and not be bogged down by a physical licensing system.

Some of the key benefits of our Pixotope platform in the VS and AR world: an intuitive UI that streamlines the use of the Unreal Engine render pipeline. We stay as current with Unreal as we’re comfortable with; ensuring the stability of the engine is one of our foremost priorities, making sure that things aren’t breaking, that things aren’t going down, that we just keep production running. We are currently at 5.1, matching Unreal. We have no proprietary hardware, so it’s all off the shelf and you can completely build your own engine as needed; we provide a multitude of spec sheets that you can reference to make sure your machine is performing as optimally as you require. It’s multi-threaded to process video separately, for higher quality and lower latency, with video pipeline control. There’s a high-quality software keyer that’s all in-engine, in software, so you don’t have to rely on externals if you don’t need to. We accept external keyers without issue, but we do have that internal one if you need a more flexible approach to keying graphics. Single-talent tracking for AR without sensors: we have the ability to track a single subject onstage without using any kind of sensors or a witness cam, strictly through the lens, in software. Asset and project synchronization between engines: the ability to add assets to certain projects and push those changes live, all within a centralized asset hub in the Pixotope software itself, so you’re not having to reload or reassign assets per project independently; you can master-control them and push them live per engine, all at once or separately. A full diagnostics view, so you can see networking connections, everything that you need, how engines are performing. Failover mechanics so that, like I said, you keep uptime and keep productions running. And health monitoring, going along with the diagnostics, ensuring the machines are running effectively and safely.

Our XR Edition, as I said, is one of the latest additions to be fully integrated into the Pixotope platform. It’s a comprehensive toolset for setting up and rendering to LED volumes, and it includes all the functionality of your VS and AR license as well. So one XR license not only gives you the ability to render to XR, with the control and functionality you would expect, but also gives you the virtual studio and the AR as well. Seamlessly combine multiple virtual production techniques, XR, set extensions, AR elements, allowing you to be as creative as possible and not limiting you based on the software or hardware. Build your own engines, adapt engines or machines you currently have: load a license on there and get to work.

One of the key benefits of the XR Edition is our auto color matching feature. Pointing the camera at the wall and clicking a button initiates the auto color matching process, which creates a LUT for the wall and AR set extensions that match. It’s a simple process that keeps things fluid and keeps things looking as natural as possible within your environment.
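As an illustration of the idea behind auto color matching, the sketch below fits a simple affine color transform from paired samples: the colors the engine sent to the wall versus the colors the camera actually saw. The real feature is described as producing a LUT, so this least-squares matrix fit is only a simplified stand-in, and every name in it is hypothetical:

```python
# Simplified stand-in for auto color matching: fit a 3x4 affine color
# matrix (least squares) mapping engine colors to on-camera wall colors,
# then apply it to AR set extensions so they match the wall on camera.
import numpy as np

def fit_color_matrix(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """src, dst: (N, 3) RGB samples in [0, 1]. Returns a (4, 3) matrix."""
    A = np.hstack([src, np.ones((src.shape[0], 1))])   # add bias column
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def apply_color_matrix(rgb: np.ndarray, M: np.ndarray) -> np.ndarray:
    A = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    return np.clip(A @ M, 0.0, 1.0)

# Example: the camera sees the wall slightly warm and dim.
reference = np.random.rand(64, 3)                  # patches sent to the wall
observed = reference * [0.90, 0.85, 0.80] + 0.02   # what the camera captured
M = fit_color_matrix(reference, observed)          # engine -> on-camera color
ar_pixels = apply_color_matrix(np.random.rand(8, 3), M)  # matched AR colors
```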
Input switching: switching between cameras in the frame, with different perspectives on the wall based on the tracking data of the new camera, for multi-camera XR with set extensions. So you’re able to input-switch live and get the perspective that you would expect, while still maintaining fluidity as the actual perspective on the wall shifts.

Digital twin is a feature that we’ve adapted using the tracking system that we acquired. The first step is that we take the wall and throw up basically pre-calibrated panels onto the wall that match the exact dimensions, the exact pixel pitch, the exact ratios of your wall. Then we use our tracking sensor to just rotate around the wall and scan it, and what that does is automatically create a digital replica of your wall. That cuts out your alignment time: it saves you cost, saves you time, and saves you the headache of having to go through a manual alignment.

We also have simple, timecode-based sync and mapping, so you’re not using dedicated display nodes and you don’t have to have separate control and render nodes. Each engine operates independently, so you can assign one as a control and basically use it for rendering to the wall as well. This gives you the ability to map specific engines to the screen, and it’s all completely frame accurate.

Tracking. Pixotope Fly, as I mentioned, is our entirely through-the-lens solution. What it’s doing is using machine learning to automatically analyze the environment that your lens is detecting. So any camera, a drone, a dolly camera, a cable cam, anything like that, you can now use in an environment without having to use a witness camera to track and build your point cloud. You can do this all independently: whatever your camera sees, it will detect. It does this by looking for points of contrast, feature points for it to tether onto, anchor onto, lock onto. Later you set your origin point, all within frame, and you can just run your tracking from whatever camera without being tethered to a witness cam.
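The sketch below illustrates the general idea of contrast-based, through-the-lens tracking using classical OpenCV building blocks. Pixotope Fly is described as machine-learning based, so this is not its actual algorithm, just a minimal sketch of anchoring to high-contrast feature points and estimating camera motion from them:

```python
# Minimal sketch of through-the-lens tracking with classical features.
# Not Pixotope Fly's actual (ML-based) algorithm; an illustration of the
# idea: detect high-contrast feature points in what the lens sees and
# estimate camera motion between frames from their movement.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)   # corner-like, high-contrast features
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def estimate_motion(prev_gray, curr_gray, K):
    """Relative camera rotation R and (unit-scale) translation t between
    two grayscale frames. K is the 3x3 intrinsics matrix from lens data."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = matcher.match(des1, des2)            # anchor points seen twice
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t
```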
We also offer 3D Living Photo, which is volumetric AR photo perspectives that change with the camera movement. You utilize a turntable photographic capture of the photo subject to then add movement, add dimension, and add quality to your living photo.

The Pixotope Marker and Vision editions utilize a witness camera, tethered either below, above, or to the side, whatever angle you need it to view from. Marker is IR-based: specifically, we’re looking for retro-reflectors that you put wherever you have space for them, and it uses those not only to create feature points but to tether and anchor onto. And then we have Vision, which utilizes a system of both: it not only has the ability to locate those markers and use them as feature points, but simultaneously it’s using that same contrast-based detection that we talked about with through-the-lens to determine exactly where it needs to be at all times. So if you move the camera beyond a place where there are markers, and you have a crossbar, or truss, or ceiling tiles, whatever is in your environment that you need to use for tracking, you have the ability to track off of that and not be totally reliant on either only contrast-based feature points or only markers; you have the ability to do both.

And then our XR tracking: we use high-frame-rate LED panels to display a tracking pattern alongside simultaneous video backgrounds. It requires GhostFrame, as well as the ROE Visual LED panels and the Megapixel Helios processors, and that gives you the ability to put markers in-panel to detect and build your tracking off of, without having to rely upon physical markers or physical feature points.

Talent Track is our presenter tracking system. With this one we’re able to do up to four presenters at one time, and we use a series of witness cameras located in the stage to track people automatically as they come in and out of frame.

And then the coming-soon part: our production graphics. Our production graphics and full newsroom workflow, which we’re very excited to be announcing later this year, will give you data connectivity, MOS, and automation within your CG graphics.

And I can’t go without mentioning our 24/7 global support. We’re very proud of this: we have an under-an-hour response time and a follow-the-sun mentality. So wherever you are, whatever time of day, whatever production you need, we have support built into the licensing system, so that whatever you need help with, we’re always available to ensure that we just keep production rolling. Thanks, guys.

Andy Smith  14:34

And now, real quick, I’m just going to dive a little bit into our Pixotope Education Program. Kevin Cooney was quoted as saying that the industry is moving at such a rapid pace that it’s difficult to find talent who can creatively problem-solve: we can build as many virtual production studios as we want, but without creative talent behind them, we’re at a massive disadvantage. The Pixotope Education Program was developed as a response to this talent shortage in virtual production, and it’s specifically designed for higher education institutions to help enable the next generation of virtual production talent.

We have several components within our education program: access to virtual production tools, training and certifications, a new app called Pixotope Pocket, as well as access to the industry and access to expertise. It comes with the same fundamental aspects that you would get in the normal Pixotope licensing structure, with the Virtual Set as well as the AR Edition that Tanner mentioned earlier, and the full comprehensive toolset of our XR Edition. At the heart of our development process lies a dedication to accessibility and reducing hardware dependencies. With that mission in mind, we have poured our utmost efforts into crafting the Pixotope Graphics XR Edition to offer seamless AR/XR workflows, addressing the challenges in XR and simplifying techniques like set extensions while ensuring high image fidelity. And then, of course, we also offer Pixotope Tracking in our educational component, which comes with all of the tracking features that Tanner mentioned.

We have partnered with the University of Gloucestershire over in the UK, as well as Husson University in Maine and UNCC here in the States, and we’re in discussions with the University of Southern California. We have training, support, and certifications as part of the education program. We want to make sure that everyone who’s using Pixotope is trained and certified. We’re passionate about building this community for one purpose and one purpose only, and change does not happen in isolation; change happens within a community. As we lean toward the future of virtual production, we want to offer our best to these young minds who are up and coming in college and looking to excel in their careers after graduation.

Pixotope Pocket is our latest addition to the Pixotope Education Program, making virtual production education more accessible and affordable to students. Each student can now have their hands on their own version of Pixotope: a tracking solution and a graphics solution in the palm of their hand, through their phone, where they can train themselves in their dorm rooms, complete assignments, and also help build a community among themselves and other universities around the country, where they can communicate back and forth and build off of one another.

Which brings me to our final point here: our industry connections and expertise. The Pixotope Education Program is a bit of a unique initiative that aims to bridge the gap between higher education and the industry. It is a valuable resource for an educational institution looking to provide students with the latest knowledge and skills in virtual production. Our creative partners include The Famous Group and Silver Spoon Animation, just to name a couple.
These students will now be able to reach out to these experts in virtual production to build their own expertise and knowledge off of theirs; they’ll be able to rely on these industry creatives and operators to build a better future for themselves. With the Pixotope Education Program we provide them with that advanced knowledge and set of tools to make sure they’re putting their best foot forward after graduation, toward a largely successful career. We thank you so much for your time today. Thanks for sticking with us. I’m going to look at my heart monitor when I walk off the stage and see what my BPM looks like now.

Laura Frank  19:03

Thanks, guys. Thank you, Andy and Tanner.

SUMMARY KEYWORDS

xr, tracking, virtual, production, ar, pixotope, engines, tanner, wall, markers, build, graphics, based, camera, software, points, ensuring, studio, solutions, latest addition

SPEAKERS

Laura Frank, Andy Smith, Tanner Woodward