Zoetrope Style Screen Designs – Then and Now

2:00pm in Studio B

Bruce Wheaton of Experience Engineers will take you Inside the Zoetrope. You’ll see why immersing the audience in screens is as powerful today as it was 25 years ago, and how to execute on it. What seems impossible might just be very, very hard.


Bruce Wheaton  00:13

Good afternoon. Welcome. Glad to meet you all, and to see some of you again. What an amazing conference it's been so far. Thank you, Laura. And XR Studios is a great facility, isn't it?

So before I dive in, I'll remind you what a zoetrope is. First built around the 1860s, zoetropes are experiments in moving pictures. They use an inward-facing strip of pictures, revealed to the eye through slits rotating on the opposite side from the images, to create an animated image that looks like it's moving. It's a fun little exploration into persistence of vision that helped bring us to the point we're at today. So do the two shows I'm talking about today use that literal mechanism? No, they do not. If you feel cheated by that, I understand. This presentation would have been even cooler if I were describing massive steel sheets spinning around the audience at unimaginably fast speeds. Anyway, sorry. No, the events I'm talking about today both had a ring of screens that suggested, to me, being inside a zoetrope. And I think being surrounded by imagery, even if it's not spinning at hundreds of feet per second, really unlocks an extra level of immersion for the audience. So I'm going to focus on the technical problems this presents more than on why it works to capture people's attention, and especially on how to solve them without just making the screens bigger, because, well, budget. So I'm going to do what I suspect I'm known for: I'm going to get too technical, too quickly. But first I'm going to tell you how I got here, and in a second I'll tell you how to find out more.

I started as a video technician in the UK, sort of by accident. I think lots of people find their way into live events by accident; there are no recruiting offices that I've ever seen. My move to big events came when I started to deal with big video projection, on light-valve projectors at the time.
And, oddly, that involved a lot of video engineering, because back then that was the only way to do something as simple as crossfading between two video inputs. That's where I found an excellent mentor, Tim Volca. He had scant regard for what others thought difficult or impossible, and was endlessly curious and inventive. And, well, patient, I'd say with the benefit of hindsight. I hope that I've stayed endlessly curious. In the freelance years after that, I dabbled in video wall programming and found I had some kind of knack for it. Around that time I did some rock and roll, and I also moved to the US; I live in the Bay Area. I did many, many years of show control programming, generally related to video screens, since video companies were my clients. And I did my time project managing video crews, and spoiler alert, they're actually kind of difficult. So I taught myself C++, since basic tools weren't getting me there, and I was able to build some applications and a media server, and spent some time selling those in between shows. And I did hundreds of shows switching screens: Spyder, Encore, etcetera. "High-res engineering," we called it at first. I was also teaching and mentoring other techs as much as I could along the way. Then I decided I needed more room to play. So I pivoted to creative technology, and, daunted by the sheer number of software applications I would have to master (the whole Adobe suite, 3D modeling, effects), I decided to focus my energy on TouchDesigner: one app to rule them all. We'll talk about TouchDesigner later, and I'm pleased to see it being featured in other talks. It's got a steep learning curve, but it can do almost anything. Then after a while we started to use disguise media servers for scalability, so we could sub-rent gear and so we could find talented programmers. Then during the pandemic, while everyone else learned to bake bread, I instead learned to make full-stack web applications. Any devs in the room?
Okay, you win. And now I'm working as a problem-solving creative technologist. And in a new adventure, I'm hosting a podcast, Backstage Wizards. The focus is on our under-documented industry: how people have solved interesting problems on shows, and what stories and adventures they had along the way. We're launching today with an episode featuring Framework's own Laura Frank online, and she took time out of organizing this conference to share some amazing knowledge with me. We have an episode with Matt Ward, who you'll hear about in a moment, coming next week. Please check it out later.

So let's get into the two Inside the Zoetrope shows. I was the lead programmer for a very interesting trade show booth for Turner, the TV network, in the late '90s. We faced a lot of challenges, and we delivered a result that baffled our peers. Maybe it does to this day. Consider: if you were asked to feed 65 independent video screens, what would that lift be like in your workflow? Then, the BMW NAC is a regular annual show for Storyteller, and that show gets creative extras in return for the client having a lot of imagination and trust, and cool cars. So I'm talking about the two shows: what was similar, what's different, and how we solved problems in different ways based on the tools we had, 25 years apart. Hopefully, you'll be able to see how the journey and the challenges along the way have led me to the creative problem-solving approach I use now.

BMW Aftersales: it's an annual conference for the brand to communicate with and celebrate their dealers and service managers. Lisa Sato was the creative director, Sullivan Taylor was the production designer, and they're both from Storyteller, a Bay Area creative production agency where I also worked. So here's what we had. We had a six-meter by 3.5-meter main screen of 1.5-millimeter ROE LED with Helios processing, and two three-meter by three-meter side screens, also 1.5-millimeter LED.
Then here's the interesting part: we had 36 free-hanging panels of ROE Vanish 8-millimeter LED (that's a transparent LED product, if you haven't run into it). And then we also had 24 mirror panels suspended in line with the LED screens. Lisa Sato, the creative director, explained that part of the design essence for BMW NAC 23 was the idea of reflection and expanding space, representing the BMW teams: who they are now and who they are becoming. "Game changers" was one of the main themes, and that theme led to the use of the photographs you'll see in a moment. And Sullivan Taylor, the production designer, said that to fulfill that part of the design for BMW NAC, the integration of mirrors into the production design was essential. The row of mirror panels, despite being in a straight line, visually completed the circle of Vanish LED panels. Let's see how that looked. And it was pretty cool.

Now on to Turner. The Turner networks booth in Atlanta in the late '90s had a very ambitious video system that our company inherited, I guess, when some other vendor dropped out, since it was an ambitious ask. Matt Ward, who's here today, was the video lead. Due to the programming needs and his reputation for delivering what others could not, this fell into his wheelhouse. And he was assisted in the decision to take on the project by his Dark Lord. At least, that has long been my assumption, and Matt has never denied it. We put our heads together recently, and to the best of my recollection, this is what we were dealing with. We had a center three-by-three projection video wall, using Toshiba cubes at the time; 32 TVs in a row above the audience on the right, with six speakers in windows; 32 TVs in a row above the audience area on the left, with six speakers. And the main ask was a cool ticker-tape scroll effect across the TVs. As Matt points out, they wanted that effect.
And we had to back into that from a technical point of view in order to figure out how it was remotely possible. And as his collaborator, I can tell you the mix of solace and anguish during pre-programming was exquisite. I feel like I owe Matt for that. We don't have any documentation, sadly, but I've tried to recreate it for you. Here's how it looked, and here's what we had to play with. So if you took some time, you might be able to work out that the numbers don't add up. For instance, the video wall processor with its 16 outputs can't feed 64 screens. And obviously four or five video players can't feed 64 screens either. So how did we do it? It's not simple, but I'm going to try to describe it.

Let's start with what you could call cheating if you want: feeding all the screens on the left with the reverse routing of the screens on the right, through a broadcast router, using two output connectors per router output. Not to get too far into the engineering, but we were dealing with composite standard-definition video at the time, with distribution amplifiers as needed. Each Doremi DDR held all the videos, and the letters were stills. My best recollection is that one deck played videos and output audio, and the other four just delivered stills. And I suppose, knowing what we know now, we could have just rasterized an entire font and symbol set onto video. But it was quicker for us to have every single frame we needed be picked out in a cue: to get the letters we needed in order, a few frames for every letter. So we'd start video playing on one DDR with a TRAX cue, routed to all the screens. Now, we could cleanly switch between signals, something that's oddly difficult to do today. That's because all the sources were genlocked and identical, and the broadcast router, made by PESA, supported a vertical interval switch, meaning it switched invisibly in the gap between video frames.
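That "cheat" can be written as a simple routing map: the left row mirrors the right row, so one router output feeds two screens. This is purely my own illustration of the idea; the index convention and function name are invented.

```python
# Mirrored routing: router output i feeds right-side screen i and, via a
# second connector on the same output, left-side screen (count - 1 - i),
# so the left wall plays the right wall's content in reverse order.

def mirrored_feeds(output, screens_per_side=32):
    """Return the two screens driven by one router output under the mirror scheme."""
    right = output
    left = screens_per_side - 1 - output
    return {"right": right, "left": left}
```

With 32 screens per side, output 0 drives right screen 0 and left screen 31, so a scroll moving one way on the right appears reversed on the left, halving the routing work.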
Having clean routing in a system breaks the lock between the quantity of processing and the quantity of outputs, which is still a valid goal. We did all the control with Dataton TRAX; you may have heard of WATCHOUT, their media server app. TRAX, running on an old trackball Mac laptop, state of the art at the time, sent cues to devices over serial connections via a device called a SmartPax. Part of our task was creating drivers to do reliable serial control of every device we needed, and creating serial cables with the exact wiring needed to communicate. No IP, no IP addresses on site.

Now, back to the video wall processor. We know it didn't have enough outputs to feed all of our screens. But what it could do was programmed freezing of any of the 16 outputs it did have. And that was Matt's crazy idea. Well, that was one of Matt's crazy ideas: one in a frankly frightening series of crazy ideas. So feeding the DDRs into the video wall processor, then the video wall processor into the composite router, was the secret sauce. We loaded four letters into four of the DDRs, then unfroze four outputs on the video wall processor routed to them, and then we froze them again. None of this is visible to the audience, because all these outputs aren't routed anywhere yet. Then we repeated the cycle with the next four letters, and the next four video wall outputs, and so on, until we had 15 letters spelling out the words "Turner Channel," plus a still we wanted to leave behind the letters. Remember that in the world of video routing, as opposed to generating content, you have to send an output to each screen. There's no black, there's no nothing; you've got to feed something. So maybe you can see where this is going. We routed each letter in turn to each screen, starting one screen from the left, then routed them all again, and so on and so on. I decided not to make this too much torture by not showing all 32 outputs. Okay, so let's see that again in context.
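The load/unfreeze/freeze batching described above can be sketched as cue generation. This is my own illustration of the pattern, not the original TRAX show file; the cue tuples, names, and batch logic are invented for the sketch.

```python
# Sketch of the letter-loading cycle: letters are loaded into four DDRs a
# batch at a time, four frozen processor outputs are unfrozen to capture
# them, then refrozen, freeing the DDRs for the next batch.

def build_load_cues(text, ddr_count=4, processor_outputs=16):
    """Return the cue list that captures each character of `text` onto its
    own (frozen) video wall processor output."""
    cues = []
    letters = list(text)
    for batch_start in range(0, len(letters), ddr_count):
        batch = letters[batch_start:batch_start + ddr_count]
        outs = [(batch_start + i) % processor_outputs for i in range(len(batch))]
        for ddr, letter in enumerate(batch):
            cues.append(("load", ddr, letter))   # cue the DDR to the letter's frame
        for out in outs:
            cues.append(("unfreeze", out))       # let the output capture the frame
        for out in outs:
            cues.append(("freeze", out))         # hold it while we reuse the DDR
    return cues
```

Each captured output then becomes a still "letter source" the router can send to any screen.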
So here's Matt and me, happy the plan worked out. We don't have any pictures from that era, but I'm pretty sure this is close to how we looked. So all it took now was a hanging garden of cues: PESA router cues in TRAX. A single move to spin the letters across the screens was 900 individual cues in the TRAX timeline. If we wanted to change the timing (and oh, we changed the timing so, so many times), we needed to move 899 of those cues by just a few fractions of a second. And what we found out was that if we routed as fast as we wanted, more speed being a smoother animation at a higher frame rate, the router's processing would crash. So we spent a few days just finding out how slowly we needed to go to reliably not crash the router. We wanted to work right up to the line, obviously, as I'm sure you would. I don't have access to TRAX or the show files anymore; I don't even own a floppy drive. But for reference, here's the cue system duplicated in TouchDesigner, just so you can see the quantity. This doesn't really express the timing, just the routing; the TRAX cues were more complex. Each TRAX cue had a specified destination and a source for every route, and then we had to execute each animation step as a router salvo. We used overlapping audio fade cues to pan sounds across the speakers, to get a feeling of movement in sync with the images, which really enhanced the effect. At the time, audio spatialization was a bit of a dark art. I sometimes think the hardest part was convincing my horrified audio friends that I didn't want a mixer with 16 inputs and four outputs; I wanted a mixer with four inputs and 16 outputs. There was a lot more to the adventures in timing. If you remember the images, we had to grab those video stills. Well, as you can imagine, we couldn't do that quickly, and that was key to how quickly we could get between different lines of text.
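The retiming problem (move 899 cues by fractions of a second, but never route faster than the router survives) can be expressed as a tiny helper. The numbers here are illustrative stand-ins, not measured values from the show.

```python
# Retiming a block of routing cues: the whole scroll is one evenly spaced
# run, so changing its speed means recomputing every cue time. The
# router-crash threshold is a made-up number for the sketch.

def safe_interval(desired, router_minimum):
    """Never route faster than the router can reliably survive."""
    return max(desired, router_minimum)

def retime(cue_count, start, interval):
    """Recompute the whole run of cue times from a start point and spacing."""
    return [round(start + i * interval, 3) for i in range(cue_count)]

step = safe_interval(desired=0.10, router_minimum=0.15)  # wanted 10 cues/sec, got ~6.7
cues = retime(cue_count=900, start=10.0, interval=step)
```

In TRAX there was no such recompute button; each of those cues had to be moved on the timeline.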
The load-unfreeze-freeze cycle was another delicate dance of timing, complicated by bugs and quirks in the DDRs and drivers that caused occasional corruption, explaining why some of us referred to the show as "Tartar Network." That was another adventure in test fixes and incredibly fine timing tweaks, because, much as it amused us, we had a feeling it wouldn't amuse the network as much. And Matt just reminded me the other day that the occasional wrong letter would still sneak through, but we just had to risk it. Okay, so that was it: the A/V of the '90s. Unfortunately, now you all know how it's done. We do have some cleanup. If you could look at the screen, please. Good afternoon. Welcome. Glad to meet you all. What an amazing conference it's been so far. XR Studios is a great facility, isn't it?

So, the cues for Turner could be regarded as one long timeline. TRAX had a frankly excellent timeline that I'd be happy to have access to on shows these days. But there wasn't great scope for modularity; the closest thing to a variable was to use a fake device at the time. By the end, the timeline looked a lot like a grapevine: one long span with huge clusters of cues hanging off it. We had to be sure that the big block of movement cues was just right before we copied it so many hundreds of times. If I'd had the tools I have now, it would have been a lot more efficient; recreating these effects to show you how it worked was much, much quicker. Although Chris, who helped me make the slides, maybe wouldn't agree. But I wouldn't pass up the learning experience for anything, and I know Matt and crew also got a lot out of it.

All right, let's see how we faced some similar challenges in a modern setting, with modern tools, 25 years later. In general, we're now in the age of the monoliths: we have very capable, high-cost systems that do everything in one system with few boxes, fully mature media servers.
Scalability now means just making sure you have enough outputs, since every output can do almost anything you need. Of course, if your system doesn't do exactly what you need, it's not like you can use two different models within a reasonable budget. We do still have some modularity in the system. An LED processor can act like a video wall processor, taking a defined chunk of pixels and distributing them evenly across panels. At BMW, that was step one: we defined a six-by-six grid and shared that with the LED team for addressing. The 36 ROE Vanish panels were hung in an arc over the audience, facing the stage. And since that's a low-resolution product, with an eight-millimeter pixel pitch, each panel is only 112 by 112 pixels square. We were dealing with one signal output easily handling all 36 screens together. We were using disguise to feed all the main screens at BMW via a Barco E2, so we decided to just also feed the Vanish panels from disguise. That meant we could run the opening sequence as one timed video and switch cleanly on cues in the disguise timeline.

In the innovative BMW design, remember the mirrors: they were carefully positioned in orientation to the Vanish panels, so the audience perceived a complete circle of screens. Some of the audience were behind the Vanish panels, so they only saw the reflection of the panels. But since the entrance was from behind the main screen, they were all inside the zoetrope from the start. Well, let's look at a piece I called the carousel. It was a photo montage of the attendees, some of whom had come directly from having their photo taken just five minutes before on a full-blown carousel. It played during walk-in, and it was featured again during the opening experience. It was key to helping the audience feel that they were the game changers.
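The shared six-by-six grid works out as simple raster math. A quick sketch using the figures from the talk; the helper and constant names are my own.

```python
# Raster math for the Vanish arc: a 6x6 addressing grid of 8mm-pitch panels,
# each 112x112 pixels, assembled into one small shared canvas.

PANEL_PX = 112   # pixels per panel side at 8mm pitch
GRID = 6         # panels per row/column in the shared layout

def panel_rect(index, grid=GRID, panel=PANEL_PX):
    """Top-left (x, y) and size of panel `index`, row-major, in the canvas."""
    row, col = divmod(index, grid)
    return (col * panel, row * panel, panel, panel)

canvas = (PANEL_PX * GRID, PANEL_PX * GRID)   # the full raster for all 36 panels
```

The whole arc fits in a 672 by 672 canvas, which is why a single signal (and later a single NDI stream) could carry all 36 screens comfortably.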
And apart from dynamic access to pictures taken moments before, the creative called for the ratio of images to blank space, and the speed of change, to vary as the weekend progressed, building to the screens being completely filled with pictures and changing rapidly. That meant a fixed, programmed, 30-minute-long timeline wasn't a great choice. I mean, consider the challenges at Turner of having so many nearly identical cues and then trying to change them. So instead, we decided to feed a generative system into disguise via NDI, the network video protocol, though I feel like everyone knows what NDI is at this point. Using NDI, instead of switching in the E2 for instance, meant that the media server timeline had control of when the generative graphics would be visible. We used the exact same grid layout for the NDI, so we didn't have to make complex mappings in the media server. For the generative system, what we go to first is TouchDesigner, a visual programming system. There are powerful alternatives, but we know what we can do with Touch; most of our projects involve TouchDesigner in some way. It's also nice to be able to scale work onto your own laptops when you're building. We run TouchDesigner on tiny NUCs and also on server-grade PCs with Quadro GPUs and SDI cards.

All right, let's get technical again, ready? Yeah, visual doesn't mean simple. TouchDesigner has a really nice replicator system. It means you can define a complex system inside a component, as if you were wiring a complex video and control system, much like we did for Turner in a way, and then replicate or clone that a number of times based on other parameters. Each copy will change live as you tweak the master, and you can feed unique parameters into each one, for instance to link it to a specific item in our list of images.
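As a plain-Python analogy for that replicator pattern: one master definition, cloned once per table row, each clone binding its own parameters. The field names here are invented; in TouchDesigner the table is a DAT and the clones are live components, not dictionaries.

```python
# One master template, replicated per table row; each clone inherits the
# master's parameters and merges in its own per-clone values.

MASTER = {"width": 112, "height": 112, "fade_frames": 15}

def replicate(master, rows):
    """Clone the master once per row, binding per-clone parameters."""
    clones = []
    for i, row in enumerate(rows):
        clone = dict(master)        # a tweak to the master re-applies on rebuild
        clone["name"] = f"panel{i}"
        clone.update(row)           # unique per-clone parameters, e.g. image path
        clones.append(clone)
    return clones

panels = replicate(MASTER, [{"image": "a.jpg"}, {"image": ""}])
```

The key property the real replicator adds is that the clones update live while you edit the master, so you tune one panel and all 36 follow.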
For the carousel (focusing on just one screen here), we decided we would feed in a file path, or a blank string to represent black, and when it changed, have the component automatically fade down, change the image, and then fade back up. There was again a little piece of timing to that, ironically similar to the timing challenges at Turner. TouchDesigner has CHOPs, or channel operators. These are TouchDesigner's nodes that handle numeric values, and they can generate a curved pulse channel. We can then apply logic to take actions when the pulse touches zero. Wiring the pulse channel to a dissolve to black made a nice dip through black, and when it was fully black, we switched out the image, a bit like the unfreeze and freeze at Turner, then faded up to the new image, or stayed at black if we were removing the image. We also used TOPs, or texture operators, TouchDesigner's nodes that process video, to crop the photos to square, do some color tweaks, make the images black and white, make the black background disappear, and then do a high-quality downscale to the limited LED resolution. Then each clone did the exact same job, but triggered by its own row in a table.

For assigning what went on which screen, what was in each table row, we had the choice to use a complex setup of visual logic, or to use the tight integration between Python and TouchDesigner. We could have used pure Python, but mixing visual programming and code is a huge boost in development. The visual representation of tables and text fields and waveforms becomes a powerful visualization of what your code and the system are doing. Literally seeing table rows change, versus trusting they'll change the way you hope; seeing versus imagining. That's a game changer. So the Python code, triggered from a timer, looked at tables that had a row for every Vanish LED panel.
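The dip-through-black behavior can be modeled as a small state machine. In the show this was CHOP curves driving a dissolve; this plain-Python version only shows the logic, with invented names and frame counts.

```python
# Per-panel fade logic: when the assigned file path changes (or goes blank),
# fade to black, swap the image while fully black, then fade back up.

class PanelFader:
    def __init__(self, fade_frames=15):
        self.fade_frames = fade_frames
        self.current = ""          # file path currently shown ("" = black)
        self.pending = None        # path waiting to be swapped in
        self.level = 0.0           # 0.0 = black, 1.0 = fully visible
        self.direction = 0         # -1 fading down, +1 fading up, 0 idle

    def assign(self, path):
        """Request a new image (or "" to clear the panel)."""
        if path != self.current:
            self.pending = path
            self.direction = -1 if self.level > 0 else +1
            if self.level == 0:            # already black: swap immediately
                self.current = self.pending
                self.pending = None

    def tick(self):
        """Advance the fade by one frame."""
        step = 1.0 / self.fade_frames
        if self.direction == -1:
            self.level = max(0.0, self.level - step)
            if self.level == 0.0:          # fully black: swap, start fading up
                self.current, self.pending = self.pending, None
                self.direction = +1 if self.current else 0
        elif self.direction == +1:
            self.level = min(1.0, self.level + step)
            if self.level == 1.0:
                self.direction = 0
```

The swap only ever happens at level 0.0, which is what makes the transition invisible, much like the Turner freeze happening while an output wasn't routed anywhere.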
The code picked the blank rows, then picked an image from a watched Dropbox folder via another table, cross-referencing against the images already visible on screen to avoid duplication. Essentially, filling and clearing rows in those tables was the driving mechanism for the entire piece. You can see that we broke the panels into batches, since the fullness looked better when we handled it one section of the panels at a time. Pseudo-randomness is a deceptively complex problem in coding, and getting a live view inside it as we created it was gold. And just like Turner, we broke the task down and worked it step by step, then put the results back together. The actual live updating of the images from the on-site photographer didn't feel like a significant challenge, since I'd worked on so many Facebook interactives, doing social media scraping, for instance back when Facebook Messenger added support for images and GIFs. Dropbox can push images around; TouchDesigner can detect files changing in a folder. Boom, done.

So now we had 36 cloned TouchDesigner components, each of which was making an image at the exact pixel size we needed. Replicators do have a script function that we could have used to programmatically wire up the outputs, but it was simpler to use Select TOPs that pull the image from the corresponding panel component based on a simple expression, using the digits at the end of the name, and then combine all of that into a set of grids. And voila, we have a grid that matches our defined LED raster, and we can output that over NDI to be received in the media server. And that was the carousel.

As a side note, when dealing with many screens, or very big screens, negative space becomes a topic. There's pushback against using it fully in corporate work; many people can't shift the "black is wrong, black is a mistake" mentality. But transparent products work best if you embrace it. Black is invisible, not a blank screen, and the other screens just become more important.
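An illustrative version of that driving mechanism: a table with a row per panel, filled in batches, picking images while avoiding duplicates already on screen. Table shapes and names are stand-ins; in the show this was Python inside TouchDesigner, triggered by a timer.

```python
import random

# Fill up to `batch` blank panel rows with images from the watched folder,
# skipping any image that's already visible somewhere on the arc.

def fill_blank_rows(panel_rows, available_images, batch, rng=random):
    """Mutate and return panel_rows; "" means a blank (black) panel."""
    visible = {img for img in panel_rows if img}
    candidates = [img for img in available_images if img not in visible]
    blanks = [i for i, img in enumerate(panel_rows) if not img]
    for i in blanks[:batch]:
        if not candidates:
            break                      # nothing fresh left to show
        pick = rng.choice(candidates)
        candidates.remove(pick)
        panel_rows[i] = pick
    return panel_rows
```

Clearing rows is the same operation in reverse, and varying `batch` over the weekend is what builds from sparse panels to a completely filled arc.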
The creative director and production designer understood this. And then they took the client into VR to show them why negative space works. As you've seen, we also used that with the photographs, with some extra processing to make the frames of the panels disappear. The audience saw floating headshots, not square screens.

Another question related to what was on the panels during the many hours of corporate presentation: black, or some sort of still? What we decided to do was take a feed from the live slide presentation and process it to derive an interesting but not distracting image that the audience would see mainly on the mirrors around the LED screens. We called this the bokeh effect, after the Japanese word for blur or haze. Again, TouchDesigner's replicator was key. We fed an SDI signal from the E2 into the TouchDesigner server, cropped a piece of the image, and applied transforms and blur until it suggested the current slide but didn't have any visible text or branding that we couldn't control. TouchDesigner's ease of experimentation was critical: we could show 10 different looks to the creative director in 10 minutes, so she could flick through slides and help us find the look she wanted. For control, we bypassed the need for control by just feeding the bokeh as a separate signal, to be used any time they wanted it. We also used the panels during the NAC party to create an amazing neon garden of VJ-style visuals, and we made a nice virtual spray-can graffiti system with the side screens. Look out for a podcast episode covering that.

So BMW NAC 23 was a huge hit with the audience and the client. Storyteller was able to immerse the audience in images without the painful expense of huge screens and huge pixel spaces to handle. From my point of view, splitting the media server programming from the generative pieces meant that we never held up rehearsals. So, 25 years apart: similar effects for the audience, very different approaches.
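The bokeh treatment can be approximated with a repeated box blur: average until no text survives, keeping only color and shape. The real pipeline was TOPs on a live SDI input; this stdlib sketch on a grayscale grid is only to show the idea.

```python
# Toy "bokeh": repeatedly average each pixel with its 3x3 neighborhood
# (clamped at the edges) until fine detail like text is gone.

def box_blur(img, passes=3):
    """Return a blurred copy of a 2D list of floats."""
    h, w = len(img), len(img[0])
    for _ in range(passes):
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                total, count = 0.0, 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            total += img[ny][nx]
                            count += 1
                out[y][x] = total / count
        img = out
    return img
```

Each pass spreads a sharp feature wider and flatter; a few passes on a cropped slide region leaves the overall color field but nothing readable.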
I actually enjoyed putting this presentation together, and it made me think about the way I used to approach things versus the way I approach things now. Back then, we didn't even have all the choices we have now; there were probably only a few ways of accomplishing Turner. Luckily, Matt and I found one of the ways, and we were willing and able to push through and make it happen. Well, my process looks a bit more like this now. Research: really understand the problem, and research tools that might help me solve it. I can't help hoping every time that I'll find the perfect tool, one system, and I never find it. Test with some of the possible tools if I can, or maybe just watch promos or tutorials to see what's possible. Research again, maybe ask for help. And then more research, and more research. I tend to research to the point where not only is everyone around me worried about when I'm going to start implementing, even I'm getting a bit worried about when I'm going to start implementing. It used to bother me, until I found a cool study that links that habit to effective problem solvers, and now I'm done worrying about it. However, relationship tip: don't try that on home improvement projects. The more you research, the more fully you understand the problem. Then, when you do switch your head into that doing mode, since you understand the problem better, you'll have much more bandwidth to experiment. And that's one of the joys of TouchDesigner: the whole time you're building in it, you're also playing and experimenting and having happy accidents as you go. It's a really fun tool for that. I think one of the other lessons that stuck with me is how to fine-tune, how to dial into the best performance you can get from any given system, and also to give yourself enough time to do that. Or, for Matt to give you enough time to do that; it's the same thing. Everything I learned from Turner 25 years ago influenced how I problem-solved for BMW. So, I hope you enjoyed hearing about this.
As I say, we're launching a podcast; we just published the initial episode, which I think will interest you. Please use the QR code, which will take you to backstagewizards.com, or look for Backstage Wizards in the usual places. There's more coming in the next few weeks. If you drop us a line at hello at turritella.com, you'll be the first to know when we have new episodes. Or maybe we'll ask you to be a guest. So, we have some time for questions. Can I answer anything for anyone? Maybe not you, Matt? [sigh] No, I can't tell you how much easier it is to be that side of the screen than this side of the screen, that's for sure.

Laura Frank  31:08

I like introducing this to our community. Yeah.

31:25

Thanks, Bruce. For 2023, for BMW, you mentioned that the creative team and production designer had to take the client through a VR experience. Were you still doing your research then? And how were you able to help them convey what was or wasn't possible?

Bruce Wheaton  31:48

Yeah, thank you for the question. So, Storyteller uses Unreal Engine for that. Luckily, Solly, the production designer, actually has some pretty good Unreal chops, so I think I ducked out of that one. I'll be happy to help you if you have a project; I'll come do Unreal for your client for you. Yeah, I had the hot bath. That was me. That was all me.

Laura Frank  32:24

What else?

32:38

Bruce, wonderful job making TouchDesigner not look terrifying. I think you did a great job giving great examples of the processes and how you use it. I'm curious how you approach optimizing your own projects. Do you go through and try to create every option and then reduce back down to normal? Or do you like to stay on a very tried-and-true path? Because I know it's all experimenting, like you said, a lot of happy accidents, right? How do you go about personally optimizing your own systems?

Bruce Wheaton  33:10

Yeah, so, on reducing: there's a nice tool in TouchDesigner which will show you how much CPU and GPU time each piece is taking. But I would say, first of all, TouchDesigner is incredibly efficient. And, I don't know, if anyone here is a programmer, one of the things they tell you is, you know, don't optimize too early. Sometimes you need to just do things the clearest way. And I have some fairly deep knowledge of what's going on in a GPU and GPU programming, and every time I get clever and say, well, I ought to do it this way, someone else will do it a totally different way in TouchDesigner. That's a huge facet of TouchDesigner: there are eight ways of doing every single task. Someone will do it some totally other way, and I'll go, well, that can't possibly work. And then I'll check it out, and it's running faster than my code. So I think I wait until there's a problem, and then I try to dig in and solve it. On the whole, with a fast enough machine, I only really get in trouble when we do insane things. You know, like

34:14

you’re downsizing way too much. And

Bruce Wheaton  34:17

Yeah. Thank you, though. And thank you for saying I made it look approachable. I mean, like I said, TouchDesigner has a learning curve when you first get into it and someone shows it to you. I don't know, I think there's a how-to for teaching people TouchDesigner where, on the first day, you grab your mouse scroll wheel and zoom in and out really fast and move around the network really fast. So yeah, I tried not to do that. I think we can release the embargo; Matt's not asking questions just because

Laura Frank  34:51

Are you sure about that?

Bruce Wheaton  34:52

Yeah. Why not?

34:56

Thanks, Bruce. A couple of questions for you. Could you talk a little bit more about how you had a photographer downstairs taking photos of these people? Was that uploaded to a server locally that the TouchDesigner server was hooked into, and you were pulling the images from there? And then, was there a reason that you chose to go with the NDI option for getting those textures into your media server, as opposed to running something like TouchEngine itself on the media server?

Bruce Wheaton  35:30

Right, okay. This is why I should take notes. Sorry, your first question was...?

35:41

Just talking about how you worked with the photographers. Were you using a local server, or the cloud?

Bruce Wheaton  35:49

Yeah. So we were in a hotel, the Aria in Las Vegas. And it's hypothetically possible to get a point-to-point link on a network in a hotel, but it's not wise. So I just used Dropbox. I mean, we use Google Drive just as much for stuff, but here I just used Dropbox. And you have a long conversation with the photographer, and then you have the conversation again, because they'll forget some parts of it. They had a nice tool which would output the photos. We didn't want them to do the black and white or resize too much; we wanted a Dropbox full of usable pictures. I think we were doing 2048 by 2048, color. And literally, we just picked those up backstage. Luckily, I mean, I have done some things at Facebook events where you're trying to scan a badge, find a picture, you know, extremely fast. This was not one of those cases. We were pretty happy with the fact that attendees had gone up an escalator, and Dropbox would get the photo to us by the time they made it up the escalator.

TouchEngine: yeah, I haven't played with it too much. But it actually goes to my point about an appropriate separation. What it meant is that my collaborator was programming the media server and I was programming TouchDesigner, so I didn't ever have to say "uploading a new version," or tweak, or fight over the machine or anything. I guess if TouchEngine's integration gets good enough that I could edit and work on it at the same time it's open somewhere else, then maybe I would consider it. And as for NDI, we've done some tests. Someone will say different, but I don't personally have much of a problem with the quality of NDI. And what I love is that you get it on the network, and you can pick it up any way you want.
So it’s good for and we and we and we sort of structure we structure our networks with like three or four different networks there so that with a lot of work, you can keep the heavyweight NDI, obviously this wasn’t. This was 672 by 672. So you wouldn’t call it heavyweight. But we had the VGA graphics were running at full resolution. And so we could keep those off on a 10 gig network and not worry about it affecting queues or anything. Sure.
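The backstage hand-off Bruce describes — photos landing in a Dropbox-synced folder and being picked up as they arrive — can be sketched as a simple polling watcher. This is an illustrative sketch, not Bruce's actual code: the folder path, image extensions, and poll interval are all assumptions.

```python
import time
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png"}  # assumed: whatever the photographer's tool exports

def new_images(folder: Path, seen: set) -> list:
    """Return image files in `folder` not yet in `seen`, marking them as seen."""
    fresh = []
    for p in sorted(folder.iterdir()):
        if p.suffix.lower() in IMAGE_EXTS and p.name not in seen:
            seen.add(p.name)
            fresh.append(p)
    return fresh

def watch(folder: Path, handle, poll_seconds: float = 2.0) -> None:
    """Poll a Dropbox-synced folder and hand each newly synced image to `handle`."""
    seen = set()
    while True:
        for path in new_images(folder, seen):
            handle(path)  # e.g. push the file path to the media server or TouchDesigner
        time.sleep(poll_seconds)
```

Polling a synced folder sidesteps any direct network link to the photographer's machine, which is the point of the Dropbox approach on an untrusted hotel network.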

Laura Frank  38:11

Anyone else? Matt? The floor is yours, man.

38:19

I've actually got a question too. Hey. Were you running a backup instance of TouchDesigner? And did you have a failover scenario to manage switching to an alternate NDI feed?

Bruce Wheaton  38:34

Yeah. So, um, yes, we were running a parallel backup server. And yeah, I think the failover would have been switching over to a different NDI feed in this case. You'd have to ask — I'm going to say this: I expect that JP made a set of duplicate cues.
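The failover Bruce describes — cues pointing at an alternate NDI feed when the primary stops delivering — boils down to picking the first healthy source in priority order. The sketch below is a generic freshness check, not the show's actual logic; the half-second staleness threshold and feed names are assumptions, and the real frame callbacks would come from an NDI receiver.

```python
import time

STALE_AFTER = 0.5  # assumed: seconds without a frame before a feed counts as dead

class FeedMonitor:
    """Tracks when each feed last delivered a frame, so cue logic can
    decide which NDI source name to hand to the media server."""

    def __init__(self, names):
        self.last_frame = {name: float("-inf") for name in names}

    def frame_received(self, name, now=None):
        """Call from the receive path every time a frame arrives on `name`."""
        self.last_frame[name] = time.monotonic() if now is None else now

    def pick(self, priority, now=None):
        """Return the first feed in priority order that is still fresh;
        if everything is stale, stay on the last-resort feed."""
        t = time.monotonic() if now is None else now
        for name in priority:
            if t - self.last_frame[name] < STALE_AFTER:
                return name
        return priority[-1]
```

Duplicate cues — one set per feed, as Bruce guesses JP built — would then just call `pick(["main", "backup"])` and fire whichever set matches.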

Laura Frank  39:02

I'll ask a question about the solutions you would come up with in the olden days, 25 years ago — the olden days of this world we live and work in. Are there some solutions — if I'm finding the right words — that were so elegant, you find you can't craft modern pathways to achieve the same type of results?

Bruce Wheaton  39:28

Yeah, that's it. The one I realized in dealing with this is the routing we could do. In an SDI system, you've probably got a nice broadcast production switcher, and it probably has 10 to 15 outputs, and you can put any of those on an aux bus, and hypothetically you could attach control to route it. But the router in your system — which is probably, like, 64 by 64 these days — won't be able to do an acceptably clean switch. It couldn't do what we were doing, which is the salvos. Of course, the routers we were on couldn't do the salvos we were trying to do either, but at least when one did it — if it managed not to crash — it would do a clean cut. So I feel like I miss that. Now you have to put E2s and Spyders and all these hyper-expensive boxes on things, and it's like, it should be easy, you know? So I would say I miss that. I'm not sure I miss the Trax timeline. I really do. I miss it in Vertex — don't tell them.
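A salvo is just a batch of output-to-input routes applied as one operation. On modern routers that speak a text control protocol — Blackmagic's Videohub Ethernet protocol is one public example — that's a single TCP message. This sketch is an illustration of the concept, not the gear from Bruce's shows; the host and route numbers are made up.

```python
import socket

def salvo_block(routes):
    """Build one Videohub-style 'VIDEO OUTPUT ROUTING:' block that applies
    every output -> input route in a single message (a salvo).
    Outputs and inputs are zero-indexed, per that protocol."""
    lines = ["VIDEO OUTPUT ROUTING:"]
    lines += [f"{out} {inp}" for out, inp in sorted(routes.items())]
    return "\n".join(lines) + "\n\n"  # a blank line terminates the block

def send_salvo(host, routes, port=9990):
    """Push the salvo to the router's TCP control port (9990 is the
    Videohub default)."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(salvo_block(routes).encode("ascii"))
```

For example, `send_salvo("10.0.0.5", {0: 12, 1: 12, 2: 5})` would point outputs 0 and 1 at input 12 and output 2 at input 5 in one shot — whether the switch lands cleanly on every output is still up to the router, which is exactly Bruce's complaint.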

Laura Frank  40:33

Anyone else? This is the place to do these things. Okay. Bruce, I'll say thank you very much. We have a little extra time here; our next session starts at five minutes after three o'clock. You can spend some personal time with Bruce out in the lobby if you want to ask the more detailed and boring questions. Thank you very much. Thank you, Bruce.


SPEAKERS

Bruce Wheaton, Laura Frank