Sponsor Session – Megapixel
11:25am in Studio A
The Future of Visual Technology
Visual technology is rapidly evolving. From the early days of LED processing to the latest advances in lighting, video, and cameras, everything is now becoming part of a network. In this session, we will discuss how these technologies are converging to create a new ecosystem of visual experiences.
Laura Frank 00:12
We’re going to move forward with our sponsor presentations. I’d like to introduce Jeremy Hochman from Megapixel. Please welcome Jeremy.
Jeremy Hochman 00:24
Thanks very much. We’re going to talk about the future of visual technology. And obviously, the future is very bright; we’ve got a lot of topics. So I’m just going to touch on a few of these things: SMPTE 2110, and the interoperability of the vast array of different types of equipment on many of these live events. For those of you that know us already, we’ve launched a new logo. I’m showing it here at Framework for the first time, so you’ve got the first sneak peek. For those of you that do not know us, I’d like to show a very brief overview of some of the visual technology that we bring to the market as Megapixel. Those are just a few visuals of the types of projects we work on; we do a vast array of everything, from development from lens to pixel, and everything in between. So visual technology is rapidly evolving. From LED processing in the early days, to lighting, to video, everything is now part of a network. And that’s a big topic I’m going to cover today: technologies and their convergence. One of the things I want to talk about is the multiple pain points we have with some of these massive solutions that need to be created. I think everybody in this room has been part of large events, whether they’re live events, fixed installs, or related to film and TV. We’re in a beautiful studio space here that’s taking very complex equipment, putting it together, and making it operate in a cohesive way. But we need to move tons and tons of video data around. We need the data to be bidirectional. And right now, in many cases, there are too many connections and conversions that make things very difficult to work together. So how do we move data around easily? Networks, and our Helios processing. Here’s an example of a project we executed a few years ago: MGM Cotai.
This is about 300 million pixels in one space, twenty-seven 4K rasters, feeding displays that were about one to three miles away from the control room. So we had to leverage fiber infrastructure in a native way to be able to move all this data around. And we also needed monitoring and analytics built in, because the displays were so far away from the control room: if you were in front of the display and wanted to change something, you weren’t near a control room to plug in. Vice versa, if you were in the control room and needed to do something to the display, you couldn’t see the display while you were making changes. So having the bidirectional data is very important, so that you understand what’s going on at any given time. In the same facility, we also deployed a theater; this was the largest-resolution theater at the time, at 16K resolution. And all of these pieces actually move around, so it’s reconfigurable with motorization, and even the seats move around to create a theater in the round or a normal proscenium. And this was sort of the beginning of that spark for us that said, okay, there’s going to be this ten-year transition between all of these disjointed technologies needing to work together and be networked. At this time we had tiles that were network devices, but the processing itself was a very direct, end-to-end link over proprietary fiber; it was not network based. And so the path we took starting Megapixel was to turn all of this long-haul signaling into something that’s IT-friendly. So from our Helios processing to the tile, we of course have a data network. This runs over a switching network: standard Ethernet switches running our configurations. Every tile device on the end has a MAC address.
And it’s treated just like a device on a network, just like a laptop would be: you plug in your laptop, it identifies itself and says, I’m a Mac laptop, Safari is being used, I’m browsing and requesting something from YouTube, and everything happens automatically. We do the same thing. The tiles actually identify themselves on the network. Helios reads all that metadata and says, oh, I understand, you’re an XYZ-brand tile, you have RGB pixels, you’re this resolution, you have this driver type; I’m going to send you data in the right way and send you the right configuration so that you produce the correct color, the correct tone mapping, all of these things. So it’s a very nice, extensible platform that lets us do a lot of things. But what’s upstream of this? Because at the moment, a lot of people plug in DVI, they plug in HDMI. You saw earlier, before the presentation, the futzing with all the cables; these things are really difficult. I wish I could just plug in an Ethernet cable and have video show up here. We believe that SMPTE 2110 is the answer to this. We have very extensive knowledge of Ethernet-based protocols down to the tile, and we’re so happy to see SMPTE come out with a standard that actually provides high-bandwidth signaling and metadata on the upstream side as well, because that now allows all of these upstream devices to start talking to each other. So if we look at a traditional data path for some of these systems: we have footage, which could be on network-attached storage or streaming in from AWS or something like that; we have servers, like disguise servers or Pixera; we have video routers; and then we have processing. All along that chain, typically, it’s been copper cables, protocols that are different, connectors that are different. And these are all failure points, because they’re all disjointed systems.
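The self-identifying-tile handshake described here can be sketched in a few lines. This is purely illustrative: the class and function names (`TileAnnouncement`, `configure_output`) and the metadata fields are assumptions, not Megapixel’s actual API; the point is only that the processor derives its output configuration from metadata the tile itself announces.

```python
from dataclasses import dataclass

@dataclass
class TileAnnouncement:
    """Metadata a tile might broadcast when it joins the network (hypothetical fields)."""
    mac: str
    brand: str
    pixel_layout: str        # e.g. "RGB"
    resolution: tuple        # (width, height) in pixels
    driver: str

def configure_output(tile: TileAnnouncement) -> dict:
    """What a processor could derive from the announcement alone."""
    return {
        "target_mac": tile.mac,
        "raster": tile.resolution,
        "color_pipeline": f"{tile.pixel_layout}-calibrated",
        "driver_profile": tile.driver,
    }

tile = TileAnnouncement(
    mac="aa:bb:cc:dd:ee:01", brand="XYZ",
    pixel_layout="RGB", resolution=(256, 128), driver="PWM-16bit",
)
config = configure_output(tile)
print(config["raster"])  # (256, 128)
```

The design point is the same one the laptop analogy makes: the device describes itself, and the sender adapts, rather than being configured by hand at both ends.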
And they’re all serially connected, which means that if there’s a metadata failure between the server and the router, that metadata doesn’t make it all the way to the processor, and doesn’t make it down to the LED tiles, so they don’t know what color space they’re supposed to be in. And you can manually override these things, of course, but that’s not the ideal intention; we really want all of this to be as seamless and automatic as possible. With a networked data path over 2110, all of these devices now simply plug into an Ethernet switch, which basically becomes your video router as well. So in many cases the router is removed from the system, or there might be a very specialized 2110 router. But we’re taking all of these data paths and bringing them into one central core, so that any piece of equipment can get any data it needs. That also lets us start thinking about consoles, APIs, other devices, lighting, tracking systems; they can all plug into the same place. So yesterday we released a 100-gig fiber module for Helios, and we have it out in the lobby for anybody that likes to geek out on hardware. We’re really excited about this. To give you an idea of how impressive that bandwidth is: you can transmit a two-hour feature DCP file in about four and a half seconds. A whole movie, for theater, really transmits that quickly. Not that we’re doing file storage, but when you think about networks that are this fast, they have much more bandwidth than you’ve previously had. Which means we can go beyond 10-bit video, we can go beyond RGB, we can go to larger and larger raster sizes, and we can start doing things with the data that we could never do before. In our particular case, we of course ingest native 8K sources, but we can also stitch multiple streams together. So we can have multiple streams coming from many sources, and they can be put together; let’s say four 4Ks, or it could be even more HD signals.
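The bandwidth claims above are easy to sanity-check with back-of-the-envelope arithmetic. Note the DCP transfer time depends entirely on the file’s size, which isn’t quoted; a ~4.5-second figure at 100 Gb/s implies a DCP of roughly 56 GB, which is a plausible size for a compressed feature, but the file size here is an assumption, not a stated spec.

```python
LINK_BPS = 100e9  # the 100 Gb/s fiber module

def transfer_seconds(size_gb: float, link_bps: float = LINK_BPS) -> float:
    """Time to move a file of size_gb gigabytes over an ideal link."""
    return size_gb * 1e9 * 8 / link_bps

def raster_gbps(w: int, h: int, fps: int, bits_per_component: int = 10,
                components: int = 3) -> float:
    """Uncompressed video bandwidth of one raster, in Gb/s."""
    return w * h * fps * bits_per_component * components / 1e9

# An assumed ~56 GB feature DCP over the 100 Gb/s link:
print(round(transfer_seconds(56), 1))        # 4.5 (seconds)

# Four uncompressed 4K60 10-bit streams stitched into one panorama:
print(round(4 * raster_gbps(3840, 2160, 60), 1))  # 59.7 (Gb/s, fits in 100G)
```

The second figure shows why four 4K streams (or more HD signals) can be stitched over a single 100-gig pipe with headroom to spare.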
They can be arbitrary sizes as well. We support up to a 32K panorama image in a single RU, which is incredible, because when you look at spaces like this, very often the width of the screen is much bigger than the height. These aren’t 16:9 aspect ratios; we want to be able to use the screens in creative ways. And so we think about this in terms of bandwidth, rather than in terms of 16:9 video rasters specifically. And this is actually available to order today, and it’s pre-tested with Matrox, NVIDIA, and Unreal Engine. So it’s very easy to use and ready to go. I touched on this a couple of slides ago: a lot of people are focusing on these large LED surfaces, because they’re such a prominent part of a live event. But we need to think about the whole ecosystem as well. The previous panel talking about lighting was fantastic, and we want to bring all of these kinds of systems into this ecosystem. So lighting and projection and LED, and even fog machines and tracking, even comms and other types of servers; we want all of these things as part of one ecosystem, so that everybody is working in a similar way. Touching on the intersection with lighting, and adding a little to the image-based lighting topic that was just discussed: we’ve co-developed a tile with Kino Flo, called the MIMIK, that has RGB and two whites, a cool white and a warm white. So it’s producing basically the entire visible spectrum. And this is a really interesting example of how to leverage the convergence of video and lighting, doing things in a networked way. This particular fixture runs our processing inside of it, but there’s a 3D LUT that analyzes every single incoming video pixel and determines, based on the color that’s being requested, how that pixel can use the five primaries to achieve the broadest spectrum possible, so that it can illuminate people properly.
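The five-primary idea can be illustrated with a toy mapping: pull the achromatic part of each requested pixel out of the RGB emitters and render it on the broader-spectrum white LEDs instead. To be clear, this closed-form split is my own simplification; the real fixture uses a calibrated 3D LUT per the description above, and `warm_ratio` is an invented knob.

```python
def rgb_to_rgbww(r: float, g: float, b: float, warm_ratio: float = 0.5):
    """Map a requested RGB color (each 0..1) onto five primaries:
    R, G, B, cool white, warm white. warm_ratio splits the white load."""
    white = min(r, g, b)                  # achromatic component
    return (
        r - white, g - white, b - white,  # chromatic remainder stays on RGB
        white * (1 - warm_ratio),         # cool white
        white * warm_ratio,               # warm white
    )

# An orange pixel: mostly red, with its gray content moved to the whites.
print(tuple(round(v, 2) for v in rgb_to_rgbww(1.0, 0.6, 0.4)))
# (0.6, 0.2, 0.0, 0.2, 0.2)
```

Rendering the white component on dedicated phosphor-converted whites is what gives the broader spectrum for illuminating skin, which narrowband RGB alone can’t do well.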
And a lot of the testing that we’ve done has not just been about, you know, lighting somebody with a C-stand, but also, in these types of environments, how we can get lighting to match the video and the video to match the lighting. And it’s really interesting when you can take a lighting fixture that’s video driven. Take the example of an orange ball that you may have rendered in Unreal: you move the ball from a high-res LED screen onto the light, and it’s the same color, it responds the same on camera, and it can illuminate a person in a better way than the traditional RGB LEDs. So this is a really, really cool thing. As Megapixel, we’re a relatively young company, but many of us on the engineering side have worked in film and video production for 20 years. The first image-based lighting production we worked on was actually in 2004, so for basically 20 years we’ve had a lot of experience with using video to drive lighting instruments. For film specifically, one of the things we definitely understand is that these types of projects have, let’s say, competing control systems: we want to control something with a lighting console, and we want to control something as a video pixel. Our mentality is, well, we should be doing both. If somebody wants to use a lighting console, please use that; if you want to use a video pixel to drive it, please use video. And because this is all network based now, we can do both. So this particular fixture, as an example, allows us to take streaming ACN (sACN) in and video data in simultaneously. And you can choose to have the video data colorized, basically using the console as an additive or subtractive filter.
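The idea of running console control and video drive side by side can be sketched simply: treat the console’s sACN color as a multiplicative (subtractive) filter over each video pixel, with a dimmer on top. This is an illustrative merge policy I’m assuming for the sketch, not the fixture’s actual logic; real sACN merging has more modes.

```python
def merge(video_px, console_rgb, dimmer):
    """Combine a video-driven pixel with console color and intensity.
    All channels 0..1; the console color acts as a subtractive filter,
    and the dimmer scales overall output."""
    return tuple(v * c * dimmer for v, c in zip(video_px, console_rgb))

# "We want it a touch more green and a bit dimmer" on a neutral video pixel:
print(tuple(round(c, 2) for c in merge((0.8, 0.8, 0.8), (0.9, 1.0, 0.9), 0.75)))
# (0.54, 0.6, 0.54)
```

Because both inputs arrive over the same network, neither control path has to be abandoned; the lighting desk trims what the video feed provides.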
And so that allows you to say, okay, I’ve got video driving my fixture, but my DP or my lighting designer wants to be able to actually change the color. We want it more green, we want it less green, we want to make it brighter, we want to make it dimmer: you have those capabilities on a lighting console, to move things around the way that would traditionally be done in a live event. And we think that rather than taking away control, we want to add more possibilities of control and put those together. So we’re really excited about this. Another example of a project: this was in conjunction with Superluminary and Dot Dot Dash, and we were fortunate enough to be asked to help provide video for it. This was a collaboration between Adidas and Prada. We made the LED system for this, but again, what was unique was that the weight that could be put on the robots was very limited. So there was an enormous effort by Superluminary to build a very, very lightweight carbon-fiber infrastructure to hold the LED modules, and they created an amazing kind of deconstructed LED tile, if you will, to execute this. But we also couldn’t afford to have massive cables and cable bundles going through this thing to control it. And so by having the Ethernet control, it was very slim, very lightweight, with minimized cabling infrastructure. And then of course we have a facility like this at XR Studios, where everything’s connected. They’re extensively using APIs, again Ethernet control, and they have color accuracy because of the bidirectional nature of the systems. They’re able to actually query what the LED tiles are capable of based on their metadata; that metadata is able to travel all the way upstream via an API, so that a server can intelligently understand what it’s driving and what its capabilities are.
And further to this, by using this infrastructure that’s fiber based and more IT-friendly: in this particular facility, there’s a server room. So all of the disguise servers, all of the Helios processors, and any kind of ancillary equipment that’s a little bit upstream or downstream of that is in a server room; it’s not in this room. And what’s amazing about that is that we don’t have the noise, we don’t have the heat. And if more processing power needs to be added, to do higher frame rates or to get more rendering power for ray tracing and things that need to be done in real time, they can keep adding servers. And that’s all in a controlled, air-conditioned environment that’s not impacting the footprint of this particular space, which is very valuable for talent and set pieces and things that are coming in. This video that’s playing is Camila Cabello; it was released on TikTok, and this was a very fun one, I think, for them to do, very much showcasing the XR part of XR Studios. And keeping in mind that server infrastructure: it’s in this facility, but it could be five miles away, or eventually it could be living in the cloud. That’s where we want to go, moving these things farther and farther out, so that people can be much more flexible in how and where they deploy the equipment that feeds the displays. And I’ll talk about one additional project. I want to tie this back a little bit to the first chat that Laura had, talking about labor and being able to try to have normal work hours. We’ve been trying to think, on the technology side and as a manufacturer: what can we do to actually help the technical staff that needs to maintain and operate these large systems? How can we make it easier? And how can we make it so that you can do these things from anywhere? Eurovision is a great example.
This was running our processing systems; it was also running, of course, large lighting arrays, lots of power distribution, and all kinds of projection mapping as well. And so we have a product called Omnibus that is really cool, because it’s able to take not just LED video, but also projection, power distribution, networking equipment, things like video routers, and it’ll actually consolidate all of this into a dashboard view that lets you see all of the equipment in a single pane of glass. We’ve taken a bit of an IT infrastructure-monitoring perspective and thought: how can we take AV systems and bring them into this world? This is an example of a screenshot from Eurovision, where we see the different LED walls. They’re very hard to see on the screen here because it’s zoomed out, but there are little projectors that you can see in the middle, and everything gets color coded. And again, this is all bidirectional data for devices that support Ethernet control right now. So this is kind of our call to action: if there’s a serial port on a device but it’s not an Ethernet device, we ask all the manufacturers, please make all of this Ethernet based so that we can all be sharing this data. And as you drill in with this system, you can see, visually, networking, routers, status of processors, all these kinds of things. And where we see benefit is, of course, when you’re on site in a production environment like this, there’s going to be an LED technician on staff, there’s going to be a lighting technician on staff, and their job is to actively ensure that these things are running. But what happens when you actually want to leave, when the carpenters leave, right? You want to know that in the morning, when the show turns on, all of the pixels are going to light up.
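The “single pane of glass” described here amounts to polling every networked device for health and color-coding the result. A minimal sketch of that loop follows; the device names, fields, and thresholds are invented for illustration, and a real system like the one described would pull this telemetry over the bidirectional Ethernet links rather than from a hardcoded list.

```python
STATUS_COLORS = {"ok": "green", "warn": "yellow", "fail": "red"}

def classify(device: dict) -> str:
    """Reduce one device's telemetry to a dashboard status (toy thresholds)."""
    if not device["reachable"]:
        return "fail"
    if device["temp_c"] > 70 or device["errors"] > 0:
        return "warn"
    return "ok"

# A hypothetical fleet: processor, projector, power distribution unit.
fleet = [
    {"name": "helios-01",    "reachable": True,  "temp_c": 45, "errors": 0},
    {"name": "projector-l3", "reachable": True,  "temp_c": 81, "errors": 2},
    {"name": "pdu-stage",    "reachable": False, "temp_c": 0,  "errors": 0},
]
for d in fleet:
    print(d["name"], STATUS_COLORS[classify(d)])
```

Once status reduces to a color per device, the same view works on site, from a hotel room, or from a NOC five miles away, which is exactly the workflow shift being argued for.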
And does it make sense to have to come in at 5am for ESU when the screen doesn’t actually need to be operating until 8am? Why can’t you have a system like this and log in from the hotel when you wake up? And yes, you go in early if you need to, but you don’t have to be sitting in the environment the entire time. We think this is something that can really help shift a lot of how people think about monitoring and maintaining the equipment, so that you can do things in a smarter way and not have to be, you know, sitting in front of it all the time. So, that is my presentation on Ethernet and communication. Thank you very much.
Speakers: Laura Frank, Jeremy Hochman