Sponsor Session – Pixera

11:00am in Studio A

Lights, Camera, Color!

Over the past few years, innovation in virtual production methodology has led to an explosion of creative possibilities in the film and TV industry, but the fundamental fact remains: on-camera subject matter still needs to look good, and natural-looking colors need to be faithfully reproduced. Join this panel of experienced virtual production professionals as they discuss challenges, opportunities, and strategies for properly lighting scenes on a virtual production set. Topics of conversation will include image-based lighting, media servers, color science, control interfaces, and more!


Laura Frank  00:12

They are going to be discussing Lights, Camera, Color! I'll let everyone introduce themselves when they come to the stage. Gentlemen, come on up.

Conor McGill  00:28

Hello, everyone, thanks for joining. My name is Conor McGill with Pixera. We are one of the sponsors of the event, and to use our sponsored time I was thinking about what topics we could cover here on stage that intersect with the Pixera feature set but are also topical and interesting. So I was, of course, attracted to the most fraught, esoteric, contentious, subjective kind of thing that I come into contact with these days, which is color and light in the context of LED-based virtual production. So that's the topic of the panel discussion today. As a representative of a manufacturer of one of the pieces of equipment that's involved with LED production, I am a secondhand source; I have many, many secondhand data points. And most of those data points come from speaking with practitioners, virtual producers, people who own or run LED studios like this one. Some of the most exciting and original ideas that I've come across have been from these fine folks here, who don't work for Pixera. They are friends and colleagues from across the industry. So to start, I'll let them introduce themselves, and then I'll frame the discussion a bit more and ask some questions. Ben, I'll start with you: introduce yourself and talk a little bit about your relevant experience.

Benjamin Dynice  02:12

Sure. My name is Ben Dynice. I came from the film and TV world; I was a lighting programmer for 15 years doing movies, commercials, music videos, and TV shows for high-end clients: Marvel, DC, Sony, Apple, that type of stuff. I got involved on the manufacturing side about eight years ago. I worked for a company called Quasar Science, manufacturing LED lights, and really got heavy into the lighting control side and the technology side of film production. From there, I was involved with the Unreal Fellowship for ICVFX, getting deeper into the virtual production side. And we really started to embrace what we call image-based lighting, which I think is part of what we're going to talk about here. Along with my cohort Tim Kang at Quasar Science, we really delved into that, and Tim dug into the technology side of how you approach it. Recently, I joined the team at Aputure and brought Tim Kang along with me. There, we are trying to push forward lighting control, push forward virtual production lighting, because the big topic that we talk about a lot in virtual production is the environment that we create with the volume. And even as the LED volume gets better in quality, with better spectrum and things like that, we'll still need to light the set. You still need the hard light; you'll still have to approach it in the same way for lighting. So there you go.

Conor McGill  03:50

Thanks, Ben. Marcus?

Marcus Bengtsson  03:51

Thank you, Ben. Can you guys hear me? Yeah. My name is Marcus, and I don't have a background in lighting. I have a background in computer science, actually; that's really how I started my career in this business. However, I did run into Ben in my previous job, where I was representing a company called LumenRadio, who produces solutions for wireless DMX control. So that's how I entered the lighting industry and started understanding the requirements and needs from a lighting perspective, especially in cinema and TV. Then I stumbled on a couple of banana peels and ended up with another media server company called disguise, and worked with that company for quite some time, producing solutions with my then old friend Ben and other lighting professionals in the film and TV industry. And now I'm with a company called Reginald Syndicate, and we position ourselves as a studio innovation company, where we produce innovative solutions specifically for virtual production. One of my areas of expertise, or not necessarily expertise, but something I find myself in often enough, is really figuring out how we control traditional lighting fixtures with modern-day media servers.

Conor McGill  05:18

Thank you. Andy?

Andy Jarosz  05:21

Hello, hello. My name is Andy. I am a virtual Production Supervisor. Haha, I'll give you that one. My name is Andy; I'm a virtual production supervisor based in Chicago. I also come at this from the film and TV industry. I started off over 10 years ago now as a camera assistant, and then I became a union special effects supervisor for many years. I got into virtual production around 2019, and I've been a serial developer and tinkerer throughout all that time. I currently run a company called Low LED Virtual, which is an R&D and consultancy company for virtual production based in Chicago. I also run a company called Smash Virtual, which operates the largest LED volume in the Midwest. And I am someone who has just always been super into filmmaking tools and technology and the science behind it, developing new and exciting workflows and trying to learn as much as I can about all of this stuff.

Conor McGill  06:22

Thank you, Andy. Excellent. So to ground the conversation, I'd really like to connect this topic with the ethos of the event and of live-pixel people. So Marcus, starting with you: could you give us a little of the genesis of the use of direct-view LED tiles in the film and TV production world, what the state of play with that technology is, and what you think some of the main benefits are over conventional shooting, either green screen or practical on-location or studio shooting?


Those are many questions in one. Which part to start with? No, I think if we look at the history of virtual production, or the history of image-based lighting, it's sitting right in front of us here; everybody in this audience has been part of taking the technology, and the storytelling tools that we have at our disposal today, to where they are today. But if we look back in time at where this whole thing started: if any of you were here during the SMPTE event yesterday, there was a very good presentation by Lux Machina on how they've pushed the envelope, starting with using projection mapping back in, I think, 2013, when they did Oblivion. That was a full wraparound using projectors instead of LED. Then obviously we moved from there into using more LED, initially only as backgrounds for work that was done in camera as a replacement for green screen. And during that time, I feel like we were still just uncovering some of the problems; we still are uncovering problems on a day-to-day basis. But we were then uncovering the problems that came with using these direct-view LED video walls, or whatever we choose to call them today, as light sources. They are narrow-spectrum LED; they are meant to be viewed with a camera; they're not meant to be used as light sources. I think that's really where the whole transition into using more traditional lighting fixtures, as a combination with, or rather an augmentation to, the LED screens really came about. But at that point, we still hadn't really figured out how to integrate these things into a holistic workflow: how do we use the same imagery that is being displayed on the LED screens to also power or control our lighting fixtures? So that's really where the push into interactive image-based lighting started; I think that was probably back in 2019, 2020.
When we started moving in that direction, there was still quite a bit to go in terms of interoperability, and making sure that a lighting fixture understands the color and lighting data that exist inside of a video file in a sensible way. But we've definitely moved quite a bit in the right direction.

Conor McGill  09:53

Excellent. Thank you. So picking up on that thread and talking about some of the technical dimensions of it: LED tiles being narrow-spectrum RGB. Andy, could you tell us a little bit about some of the challenges that you face in your LED volume related to that?

Andy Jarosz  10:10

Yeah, absolutely. So the issue that you're going to run into when you're using an LED panel as a light source is something called metamerism, which is a phenomenon in which two objects that are different colors can appear to be the same color under certain lighting conditions. That's a very vague definition, because metamerism is a big concept; it's been known about since the mid-1800s and is core to color vision and colorimetry in general. Practically speaking, from a virtual production standpoint, how it manifests itself is: say you have an object that reflects very specific parts of the color spectrum. Well, LED panels only emit three colors, right? Red, green, and blue. And that works for our eyes and for cameras, because they only see red, green, and blue. But an object can absorb and reflect any arbitrary set of wavelengths. So if you have an object that very specifically reflects yellow light, you're not illuminating it with yellow light; you're only giving it red, green, and blue. In sort of the best-case scenario, you have an object like human skin, which absorbs and reflects a multitude of colors, and it just results in things looking a little bit off, not quite as pleasing or as accurate. In very extreme scenarios, you can actually wind up with a situation where the object looks just black, because you're simply not giving it any light that it can reflect back at you. This is something that I think often gets overlooked, because it is something of a technical thing. But the issue is that it manifests itself in the worst possible scenarios, like when we're shooting products, right? A client wants a product to look as accurate as possible; they invested a lot of time and money in that specific color, and in the manufacturing process to maintain that color.
So if you then illuminate it with a light source that isn't providing the full spectrum, it's not going to give you the accurate color, and it's going to cause a problem. So this is where image-based lighting, this is where using actual studio lighting fixtures to augment that, and even new things like Brompton Technology integrating RGBW pixels, are going to be really helpful.
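As a toy numerical sketch of the failure mode Andy describes: the spectra below are illustrative Gaussians, not measured data, but they show how a narrow-band yellow reflector returns almost no energy under a three-spike RGB illuminant compared to a full-spectrum source.

```python
import numpy as np

# Wavelength axis, 380-730 nm in 5 nm steps
wl = np.arange(380, 735, 5.0)

def band(center, width):
    """Gaussian band: a toy stand-in for an emission or reflectance peak."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical RGB LED panel: three narrow emission spikes
led_panel = band(630, 10) + band(530, 12) + band(465, 10)

# Full-spectrum reference: flat across the visible range
full_spectrum = np.ones_like(wl)

# An object that only reflects a narrow band of yellow (~580 nm)
yellow_object = band(580, 8)

def reflected_energy(illuminant, reflectance):
    # Light available to the camera = illuminant * reflectance, summed over wavelength
    return float(np.sum(illuminant * reflectance) * 5.0)  # 5 nm bin width

e_led = reflected_energy(led_panel, yellow_object)
e_full = reflected_energy(full_spectrum, yellow_object)
# The yellow object goes nearly black under the RGB spikes
print(f"RGB LED: {e_led:.3f}  vs  full spectrum: {e_full:.3f}")
```

With these made-up spectra the object reflects only a small fraction of a percent of the energy it would under a flat illuminant, which is the "object looks just black" scenario in the extreme.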

Conor McGill  12:09

Great, that's fantastic. Thank you. So to continue with this idea of image-based lighting: first of all, for those in the room who don't know, Ben, maybe you could define that, and then talk a little bit about some of the technologies that you've been involved in developing to address the issues that Andy has brought up?

Benjamin Dynice  12:25

Sure, yeah. When you're creating an LED volume, what you're essentially doing is making a big, giant, soft source of the world around you. So when you're creating this world around you, it's all soft light. And as Andy just described, it is all very spiky-spectrum soft light. The way that you can combat that is to take the same type of imagery, the same video wall content, and instead of only running it through the wall, you can actually run it through lights. Well, there's an interesting thing that's happened over the past while, and it's probably happened forever in technology: a problem occurs on set, the existing technology is used to try to solve that problem, and you kind of get there; you get to a point where you're like, oh, this kind of works. But then you iterate the technology further, and further, and you solve that problem. But guess what, another problem occurs. So now you iterate again and again. Image-based lighting has only really come about because the technology is now able to do this. Before, we could just light with a big white panel, or a different color of a white panel, or pixelate the panel. But now, with the pixelation that we get in the LED walls, and the pixelation that we get in LED lighting products that actually have, you know, five, six, seven colors, we can create that same environment in a soft light, but with full-spectrum lighting, or as close to full-spectrum lighting as we can. And with the research that we did at Quasar Science, we figured out the right pixel size to do that. Because if the pixel is actually too small, the light starts to get a little muddy; it doesn't really give that interactivity that you want on an LED volume. You don't get that interactivity if the pixels are too small.
So you really have to do the interactivity with the right size pixels to get that environment to, you know, affect your actor. Because in a volume, there are really two aspects of believability: what is the actor touching, like a prop, or a bridge that they're on that extends into the background, something physical that's repeated in the environment; and what's touching them. And the only thing really touching them is the lighting at that point. So you have to have that be the same quality as the cinema lights that we've used for a while, and you get that with image-based lighting, because you can send that media to the pixelated array to create that environment.
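A rough sketch of the pixel-mapping idea Ben describes, sending the wall's media through a coarse array of lights. The function name is hypothetical; it just average-pools a video frame down to one RGB value per fixture zone, and keeping the grid deliberately coarse preserves the bold regional changes that read as interactivity.

```python
import numpy as np

def frame_to_fixture_grid(frame, rows, cols):
    """Average-pool a video frame (H, W, 3) down to a coarse rows x cols
    grid of RGB values, one per lighting fixture / pixel zone."""
    h, w, _ = frame.shape
    grid = np.zeros((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = block.reshape(-1, 3).mean(axis=0)
    return grid

# A frame that is red on the left, blue on the right...
frame = np.zeros((4, 4, 3))
frame[:, :2] = [255, 0, 0]
frame[:, 2:] = [0, 0, 255]
# ...maps to one red fixture and one blue fixture
print(frame_to_fixture_grid(frame, 1, 2))
```

In a real pipeline these per-zone values would then be packed into DMX/sACN universes addressed to each fixture's pixel zones.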

Conor McGill  14:55

Excellent. That is amazing. So this all sounds great, like we've solved the problem. But Marcus, you've said some interesting things to me about the deficiencies of DMX, because these image-based lighting fixtures are typically still communicated with over DMX, via sACN and Art-Net. So if you could speak a little bit about the deficiencies of DMX. And also, you had already mentioned the ability of these fixtures to interpret video data in a sensible way; if you could elaborate on that a little bit too. That's interesting.

Marcus Bengtsson  15:28

Sure. Absolutely, I'd love to. We only have nine minutes left, so... but I think

Conor McGill  15:32

we'll just be over here, continuing to talk.

Marcus Bengtsson  15:36

The deficiencies of DMX... I mean, that's a topic we could probably talk about for a very, very long time. DMX is still heavily used, whether in the shape of running down a five-pin RS-485 cable, as Art-Net, or as streaming ACN. It has many different names, but ultimately it all suffers from the same deficiencies. It's a unidirectional protocol; it doesn't carry any metadata; it doesn't carry any contextual information whatsoever. It doesn't support the higher frame rates that we're using today; DMX is limited to a 44 Hz refresh rate if you're running a full universe of DMX. It doesn't support synchronization: today, everything around us is actually genlocked, and there's no way we would have been able to genlock a traditional DMX-controlled light. And that's only the start. If we then look at the applications we're using DMX for today in terms of controlling lights, it's not necessarily with a console anymore; we're using a media server to drive the lights, and we're playing back videos that may include HDR content, content that is calibrated to a specific peak brightness, et cetera. There's no way for us to actually tell the light, in a smart way, what to do. So it's a little bit ironic that we are sitting in a scenario where we have so much control over our video devices, right? We have control over what color space we're using between the media server and the processor; we have control over exactly how every single pixel in the displays around us is behaving, on a microsecond level. But on the lighting side, which we depend on so much in order to actually create the believable product, we don't have that level of control.
And I feel that is something where, as an industry, we need to start asking tougher questions, both of manufacturers on both sides and of the standardization bodies, to ensure that we get to a point where we have the same level of control over lighting fixtures as we do over video devices.
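Marcus's 44 Hz figure falls straight out of the DMX512 line timing. A quick sketch, using the standard 250 kbit/s line rate and typical (not minimum) break and mark-after-break durations:

```python
# DMX512 line parameters (ANSI E1.11): 250 kbit/s, 11 bits per slot
BIT_TIME_US = 1e6 / 250_000        # 4 us per bit
SLOT_US = 11 * BIT_TIME_US         # 1 start + 8 data + 2 stop bits = 44 us
BREAK_US, MAB_US = 92, 12          # typical break and mark-after-break durations

def max_refresh_hz(slots=512):
    # One packet = break + mark-after-break + start-code slot + data slots
    frame_us = BREAK_US + MAB_US + (slots + 1) * SLOT_US
    return 1e6 / frame_us

# A full 512-slot universe tops out around 44 Hz
print(f"{max_refresh_hz():.1f} Hz")
# Shorter packets refresh faster, which is why small rigs can exceed 44 Hz
print(f"{max_refresh_hz(24):.0f} Hz")
```

So even a perfectly driven full universe cannot keep up with, say, 48 or 60 fps content, which is the mismatch Marcus is pointing at.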

Benjamin Dynice  18:03

My counterpoint...

Conor McGill  18:04

You can go for it. Absolutely. Yes.

Benjamin Dynice  18:06

Well, I'd just like to say that, yes, DMX does have all those deficiencies. As a lighting manufacturer, DMX has been my world for the past 20 years or so, and we've been coming up with ways to overcome this. You know, sACN and multicast: it's the best form of DMX, even though it's an Ethernet-based protocol. Because at the end of the line, DMX is still 512 channels; it still gets conveyed in that way. And when you get into these highly pixelated lights, they're instantly a universe of DMX each. So I think, as a lighting manufacturer and as a programmer, the responsibility is going to come down to: how do we improve this? Because with a lot of the governing bodies that regulate the protocols, by the time the manufacturers are ready for something new, the last protocol is finally getting pushed out. So I think we have to iterate faster, and I think we have to try more things. What's really interesting about virtual production is that, since it's in this constant state of experimentation, of finding the workflow, and there's not one way to do things, it's providing a more fertile ground to experiment. So as a manufacturer, I think we have to keep pushing that. And the way we push it as a manufacturer is through people using it: the users, the professionals like yourselves out there, discovering the new problems and pushing us to find the solutions, so that we can make better tools, essentially.
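On the sACN point: E1.31 (streaming ACN) carries each DMX universe on its own IPv4 multicast group, which is what makes it efficient to distribute many universes over one network. The universe-to-address mapping is fixed by the spec:

```python
def sacn_multicast_ip(universe):
    """E1.31 maps each DMX universe to an IPv4 multicast group:
    239.255.<universe high byte>.<universe low byte>."""
    if not 1 <= universe <= 63999:
        raise ValueError("E1.31 universes run 1-63999")
    return f"239.255.{universe >> 8}.{universe & 0xFF}"

print(sacn_multicast_ip(1))    # universe 1
print(sacn_multicast_ip(300))  # universe 300
```

A receiver only joins the groups for the universes it is patched to, so a highly pixelated fixture that is "instantly a universe of DMX" only pulls its own traffic off the wire.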

Conor McGill  19:35

Fabulous. So Andy, as somebody who runs a stage day in and day out, what are some of your thoughts on the current state of the image-based lighting workflow versus conventional lighting control? And where do you hope it goes from here?

Andy Jarosz  19:49

Yeah, absolutely. I was saying earlier that I think I was kind of brought here as the foil to the image-based lighting conversation, because our experiences with it so far have been somewhat mixed, just as far as the state of the technology and where it's at. This is something I see a lot in the virtual production space: a lot of companies operate in this vacuum of virtual production, and the solutions to the problems that come up only work in the virtual production context. But it's important to remember that the tools we make are being integrated into a production workflow, a very well-established production workflow that's used every single day all around the world. And if the tools we make are not readily able to be integrated into that, then it's always going to be a struggle; it's always going to be difficult to get people to adopt them. It is tough; people in the film industry don't like change, they don't like new things. It's not an industry that is conducive to taking risks. So you kind of have to trick people into trying out new things by making sure they're highly interoperable with the way they're already used to working. When it comes to image-based lighting, the current state of the art largely removes the control that cinematographers and lighting programmers have previously had, and replaces it with driving everything with a video. And the thing is, the videos are most often not shot with the intention of being used as light sources; they're meant to be viewed directly. So often you want to be able to make tweaks, you want to be able to change things; the cinematographer wants to be able to make creative decisions. And right now that's very difficult to do with the current state of the art. It's not always easy to make those changes, and that's really frustrating for creative folks.

Conor McGill  21:23

Absolutely. And Ben, you've actually given me some really interesting insight on set dynamics, the politics of this tectonic shift. Can you maybe speak a little bit about that, and where you see the opportunities to educate, or indoctrinate, if you will?

Benjamin Dynice  21:39

Yeah, with onboarding people, sure. With the image-based lighting concept, and even using LED walls as a light source, it's come down to: who's in control of that? Is it the DIT, changing the colors or the color space of the LED wall on the fly? Does it go to the lighting programmer? It really ultimately comes back to the why: why are we doing this? What's the creative decision being made, you know, by the director or the DP or the VP sup? How we arrive at that decision normally dictates how it's going to be controlled on set. And I think, as we keep pushing further and as virtual production comes down in cost, that will really help image-based lighting, to Andy's point, to spread out and be more widely adopted, because it's not just a virtual production tool. But it has to come from the creative side; they have to be able to justify the expense, ultimately, to use it.

Conor McGill  22:37

Yeah. Excellent. So we're almost at the end of our time. If anybody has any questions, I don't know if there's a mic, or if any of the panelists have closing thoughts, I would invite them now.

Andy Jarosz  22:50

You have two minutes to ask questions.

Conor McGill  22:53

Two, okay. Does anybody have any questions? Otherwise, we'll end early. Sure. Okay.


Hey, I'm Brandon. I've worked with Tim actually quite a bit at a stage up in Seattle on some image-based lighting, using Pixera, all that stuff. I come at this from more of a DP, camera side; lighting is all I care about for that stuff. And one of the things we've briefly talked about, going from the LED wall: at that studio in Seattle, I'm having issues where the LD is super nervous about all the image-based lighting, because he's losing a lot of control. And part of that is that all these lights he wants to use are obviously not calibrated to the same color spectrums; every light manufacturer has their own color spectrums. Is there a potential of doing something like that, almost like a LUT-based approach, to bring all these lights into one, like an ACES workflow for lighting?

Conor McGill  23:55

That's a question I was gonna ask you, Marcus.

Marcus Bengtsson  23:57

Yeah, that's a really good question. Okay, thank you. There has actually been some work done within the ASC, the American Society of Cinematographers. They're working on producing recommendations and best practices for lighting manufacturers on how you calibrate your lighting fixtures to a specific video-based color space: Rec 709, Rec 2020, and so on and so forth. And also recommendations on standardization around DMX profiles, which will make it easier to control multiple fixtures from multiple different manufacturers using the same profile, so you have a higher degree of predictability.

Benjamin Dynice  24:40

Yeah, and the first step of that is a push to get everybody to just start with Rec 709. Because typically, in an RGB profile, when you set the light to full red, that's the manufacturer's decision of what red is. There's no standard for that. So if we could first decide, make me an RGB profile that's 16-bit and that is in Rec 709, I think that's a great place to start to solve those problems. Because previously, when you were doing image-based lighting, the issue was: this doesn't match; I was pixel-mapping these lights and it's way too saturated, and now I've got to make all these changes to try and make it fit what I'm trying to achieve. Well, that's because it didn't have a color space yet. So that's what we're trying to do.
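Ben's point, that "full red" means nothing until a color space is attached, can be made concrete: once a fixture profile is defined in Rec 709, full red is one specific chromaticity. A sketch using the standard linear Rec 709 RGB-to-XYZ matrix:

```python
import numpy as np

# Linear Rec 709 RGB -> CIE XYZ (D65 white point), standard matrix
M_709 = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def chromaticity(rgb):
    """xy chromaticity of a linear Rec 709 RGB triple."""
    X, Y, Z = M_709 @ np.asarray(rgb, dtype=float)
    s = X + Y + Z
    return round(X / s, 3), round(Y / s, 3)

print(chromaticity([1, 0, 0]))  # the Rec 709 red primary: (0.64, 0.33)
print(chromaticity([1, 1, 1]))  # the D65 white point: (0.313, 0.329)
```

With a calibrated profile, every manufacturer's "full red" would land on that same (0.64, 0.33) point, instead of wherever each fixture's native red emitter happens to sit.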

Conor McGill  25:23

Excellent. Well, this is clearly proving fertile ground for continued conversation, so I invite anybody who wants to continue the conversation to see us out in the lobby. But I'll throw it back to Laura. Thank you. Thank you, panelists. Appreciate it. Thank you.



