
#Holotype #Programming #GameDesign #Python #BoardGames #Science
Summary
A follow-up to our last episode: in this episode we interview Brett Harrison, one of the co-creators of Holotype. Specifically, Brett is the one who programmed a computer to play Holotype against itself tens of thousands of times per minute in order to precisely balance the point values of different parts of the game. In this interview we cover a bit of the background behind Holotype, why he did that sort of optimization, and how it shaped the final design.
Timestamps
- 00:00 - Introductions
- 02:13 - Design of Holotype
- 03:54 - Playtesting and computer simulation
- 11:37 - Designing the right AIs
- 14:50 - What parts got tweaked
- 18:35 - Game length, Bone Wars, and an app
- 21:20 - Lost mechanics, accuracy, and personal favorites
- 25:15 - Closing remarks
Find our socials at https://www.gamingwithscience.net
Links
Brexwerx Games: https://www.brexwerxgames.com/
This episode of Gaming with Science™ was produced with the help of the University of Georgia and is distributed under a Creative Commons Attribution-Noncommercial (CC BY-NC 4.0) license.
Full Transcript
(Some platforms truncate the transcript due to length restrictions. If so, you can always find the full transcript on https://www.gamingwithscience.net/ )
Brian 0:06
Hello, and welcome to the Gaming with Science podcast, where we talk about the science behind some of your favorite games.
Jason 0:12
Today, we'll be interviewing Brett Harrison from Brexwerx Games. All right, welcome back to Gaming with Science. This is Jason.
Brian 0:22
This is Brian,
Jason 0:23
and today we have another special guest. This is Brett Harrison from Brexwerx Games. Brett, can you introduce yourself?
Brett 0:29
Hi, yeah. Brett Harrison, Brexwerx Games, one of the designers on Holotype, which is our board game about paleontology.
Jason 0:35
Yes. And when this episode drops, it'll be about two weeks after we put out our episode on Holotype itself, where we were talking with some paleontologists. But I wanted to ask some more about what you did for the game. So our listeners are already familiar with the game itself, how it plays, the science in it, but I wanted to follow up on an aspect I learned about when I was doing research for that episode: you created a computer simulation to basically playtest the game. We'll get to that in a moment, but I think we need to learn a little bit more about you first. So can you give us, like, what's your background? How did you get into making board games, paleontology, that sort of thing?
Brett 1:13
Yeah. So I've always been into paleontology since I was a little boy. My grandfather brought me a book from the Los Angeles library, which I still have, so it's incredibly overdue, but it was all about dinosaurs, and he would read it to me when I was like six, and that's how I kind of got hooked on it. Then I eventually went to college for computer science, and needing to get electives and stuff, I always chose paleontology or geology and things like that. So that's where I get that background. I've also been a gamer since really young, started out with, like, Axis & Allies, and went up through all the Euros and everything else that's out there. So really big into board gaming. I've designed games, computer games and board games, in the past, but nothing of this nature until COVID hit, and me and my buddy were like, let's make a board game. So that's when we started working on Holotype. And the theme was dinosaurs from the beginning, because I was so into paleontology, but we also wanted to make it, you know, as scientifically accurate as possible within something that's supposed to be a fun game.
Brian 2:11
That was really obvious. We played with a couple of paleontologists for our episode on Holotype, and they really appreciated the attention to detail, and so do I. I think that's really important for a science game: you can never have it fully accurate, right? You always have to decide how much you're going to simulate, where the fun point is, where you're going to lose fun for the sake of accuracy, and you guys did a really good job. Maybe if you do another game for the Patreon, you can have a stretch goal to pay those overdue library fees.
Brett 2:40
All right, yeah, I just, I'm never gonna ask you if it's overdue.
Jason 2:44
Well, I think you succeeded as far as, like, the fun and the accuracy, because our listeners will have already heard that at the end of our episodes we basically give a grade for both the fun of the game and the science in the game. And you got A's on both of them, right up there with some of our favorites, like Wingspan and Cytosis. So y'all did very well.
Brett 3:03
Awesome. Yeah, there's so many dinosaur games out there that are, like, you know, the theme park and dinosaurs taking over and all that kind of stuff. We wanted to make something that was, as we call it, our love letter to paleontology. And we did work with paleontologists to make sure it was accurate. We worked with the Southwest paleontological group to, you know, get those resources of having a PhD go over and check, and there's stuff that got changed in the game because it was not accurate. Like, I guess stegosaurs and ankylosaurs are not technically ornithopods, so we had to go with a much more inclusive clade that included them and the ceratopsians, which is why there's genasaurs. Those used to be labeled ornithopods in the game, and that was not correct.
Brian 3:44
Yeah, I didn't know the term either, for sure; that was a surprise. We actually talked about that a little bit, like, what the heck is a genasaur? But they appreciated the use of the term. They said it's the better term.
Jason 3:54
So now I want to drill down to the whole reason why I wanted to get you on this podcast, which is that while I was looking up information on this game, I ran across some of your other promos and interviews, and you mentioned the playtesting of this. Can you walk us through the playtesting process? I'm really interested in the computer part, but how did you playtest the game in all its aspects? And then we'll focus on the computer.
Brian 4:16
Let me give some context for why Jason is excited about this. He is a programmer. I am not, so I'm gonna sit here and listen politely.
Jason 4:22
Yeah, so I'm a bioinformatician, so I study, I use computer programming to answer biology questions. I have this long running goal of having a little like, a few minisodes about teaching computers to play games, which is kind of what you did. So I'm really curious about this.
Brett 4:37
So playtesting, I mean, it started very early on in the process of, like, okay, these are the mechanics that we want to do, and this is all done with me prototyping ridiculously bad computer-programmer graphics on paper and cutting them out and gluing them on cards and all that kind of stuff. And granted, this all took place during the pandemic, so our normal gaming group that would normally get together, that wasn't as possible. But we'd have a set group of about four people that started playtesting it early on in that form, and that's where you learn what's fun, what's not fun, what seemed like a great idea but, once you get it in practice, makes the game 20 minutes longer, that kind of stuff. So all that kind of playtesting took place within a very small group to work out those, you know, core mechanics. And then we went to using a program called Tabletop Simulator on Steam to do our virtual playtesting, because that way we could get people from all over the country playing it. We didn't have to get people together, because we couldn't with COVID, that kind of thing. There's no real automation in that one; it just replicates all the components so people can play at a virtual table. And so with that, we again fine-tuned more of the mechanics, and then once all the mechanics were kind of solidified and we had our "this is fun, it plays fast, the loop's right," that's when I went into doing the simulation in Python that would actually play the game, so that I could do all the balancing that was necessary to make sure it was really tight for victory points, and there wasn't one strategy or card that threw the balance out.
Jason 6:08
Okay, so this sounds like this was the late-stage polishing of the game. Basically, you got the big things done by playing with other people, either in person or online, but then this is when it's like, okay, let's fiddle with all the little knobs and tweaks to try to get it nice and smooth. Is that basically right?
Brett 6:23
Yeah. And the reason behind that is, like, the simulation can't tell you if it's fun. So earlier would have been too early in the process, because there were things I would have programmed that we just threw out, because, yeah, that's not a fun mechanic. So it had to be at that stage for it to be useful, and then it becomes incredibly useful for balancing and determining things even down to manufacturing: the number of little wooden cubes that are in the box increases weight, and each one of those costs a penny. So how many of those do I include of each type? That's something I can calculate during a playtest, but I want it over, you know, 10,000 playtests, to figure out what's the maximum that's ever in use, because then I can go, okay, that's what we manufacture, and no one will ever run out and we don't overproduce.
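To make that component-count idea concrete, here's a minimal Python sketch of the kind of calculation Brett is describing: record the peak cube usage per simulated game, then take the maximum over many runs. The game itself is just a random stub here, and all names and numbers (simulate_one_game, the resource types, the 10-40 range) are invented for illustration, not taken from Brexwerx's actual code.

```python
import random
from collections import defaultdict

def simulate_one_game(rng):
    # Placeholder for a full simulated game: report the peak number of cubes
    # of each resource type that were ever on the table during that game.
    return {resource: rng.randint(10, 40)
            for resource in ("triassic", "jurassic", "cretaceous")}

def max_cubes_needed(num_games=10_000, seed=42):
    rng = random.Random(seed)
    overall = defaultdict(int)
    for _ in range(num_games):
        for resource, peak in simulate_one_game(rng).items():
            overall[resource] = max(overall[resource], peak)
    return dict(overall)

if __name__ == "__main__":
    # e.g. {'triassic': 40, ...}: manufacture at least that many of each cube.
    print(max_cubes_needed())
```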
Jason 7:10
I hadn't even thought about that. I was thinking pure game mechanics, not even game components, and how many cubes do we need?
Brett 7:16
And every penny adds up to five cents, which adds up, along with the weight and shipping it over from the manufacturer and all that kind of stuff. So...
Jason 7:23
How hard was it to code the game into a computer? I've tried this once or twice, and quickly realized that even a relatively simple game actually has a lot of moving parts when you get down to computer code, like, how hard was that to actually instantiate it as a digital thing?
Brett 7:37
Let's see. I did it in Python, so coding-wise it wasn't too difficult, just because that's fairly easy to code in. The game mechanics being finalized made it a lot easier to replicate what was actually happening in the game. So the main thing was just getting the game states: having all the cards and their values, basically building that database of what all the game components are and what their values are within the game. And then I could move to actually making the player AI that would play the game, which basically just manipulates the game state. The other thing is, because it's a simulation, I don't have to worry about player input and all that kind of stuff, you know. So that takes a lot of the UI out of making something like that.
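For readers curious what "build the component database and let the AI manipulate the game state" might look like, here's a rough Python sketch under assumed names. The Card fields, the cards.json file, and take_turn are hypothetical stand-ins, not Holotype's real data model; the point is just that, with no UI, a turn is simply a function that reads and mutates state.

```python
import json
from dataclasses import dataclass, field

@dataclass
class Card:
    name: str
    period: str   # e.g. "Triassic" / "Jurassic" / "Cretaceous"
    cost: int     # resource cost to publish (assumed field)
    points: int   # victory points (assumed field)

@dataclass
class GameState:
    deck: list
    hands: dict = field(default_factory=dict)   # player name -> list of Cards
    scores: dict = field(default_factory=dict)  # player name -> points

def load_components(path="cards.json"):
    # Hypothetical card database, e.g.
    # [{"name": "Coelophysis", "period": "Triassic", "cost": 2, "points": 3}, ...]
    with open(path) as f:
        return [Card(**entry) for entry in json.load(f)]

def take_turn(state: GameState, player: str) -> GameState:
    # With no human input or UI, an AI "turn" is just a state update;
    # here it simply draws a card into the player's hand.
    if state.deck:
        state.hands.setdefault(player, []).append(state.deck.pop())
    return state
```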
Jason 8:12
Yeah. And for listeners not familiar with the term, UI is user interface. It's basically how you interact with the computer; it's the part that looks pretty and is meant for you to use. And if the computer is just running by itself, it doesn't need that, because it's just talking to itself with code.
Brett 8:27
There was a UI in the sense that I needed a graphical interface that would show what was happening, so I could see how the players were behaving. Was it actually playing correctly, or had I coded something, you know, wrong? Instead of just looking at the data, I wanted to see the visual of what was going on. But that's very crude graphics, and really nothing you'd want people seeing. It was basically a sanity check of, is this actually playing the game correctly?
Brian 8:51
We were joking. It seems like Holotype might have been played more times than any other game just because of the simulation time.
Brett 8:58
That's a good and interesting point, because once everything was working and doing what it was supposed to do, my basic run of the simulation, with the graphics part turned off, would play basically 50,000 simulations of the game in about a minute. It took me way longer to deal with getting that data into Excel and making reports that, you know, gave me some interesting info than it did to actually run the simulation once everything was said and done. And any time something showed up that was like, oh, that should be, you know, modified a little bit, I could just go into the JSON file, change the value of a card or change the, you know, point value of something, and run the simulation again right away to get another output: did that change it? Did that go in the direction we wanted? That kind of stuff. So yeah, to say it was run 50,000 times, it was really more like 50,000 times 100.
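Here's a hedged sketch of that batch-and-tweak loop: run the simulation headless many times, summarize win rates and score margins, then change a value and rerun. Everything in it (the play_one_game stub, the player names, the score model) is invented for illustration; only the overall shape, tens of thousands of headless games followed by a comparison of summaries, comes from what Brett describes.

```python
import random
import statistics

PLAYERS = ["AI_1", "AI_2", "AI_3"]

def play_one_game(card_values, rng):
    # Stub for the real Holotype simulation: invent final scores and report
    # the winner plus the top-two scores. A real run would play the full game.
    avg = sum(card_values) / len(card_values)
    scores = {p: rng.gauss(avg * 10, 5) for p in PLAYERS}
    ranked = sorted(scores, key=scores.get, reverse=True)
    return {"winner": ranked[0],
            "winning_score": scores[ranked[0]],
            "runner_up_score": scores[ranked[1]]}

def run_batch(card_values, num_games=50_000, seed=0):
    # Headless: no graphics, just play the game num_games times.
    rng = random.Random(seed)
    return [play_one_game(card_values, rng) for _ in range(num_games)]

def summarize(results):
    wins = {}
    margins = []
    for r in results:
        wins[r["winner"]] = wins.get(r["winner"], 0) + 1
        margins.append(r["winning_score"] - r["runner_up_score"])
    return {"win_rates": {p: n / len(results) for p, n in wins.items()},
            "mean_margin": statistics.mean(margins)}

if __name__ == "__main__":
    # The loop Brett describes: run a baseline, tweak a point value in the
    # card-definition JSON (not shown here), rerun, and compare summaries.
    print(summarize(run_batch(card_values=[2, 3, 3, 4, 5])))
```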
Jason 9:50
So like, now that I've heard this, this sounds like an obvious thing to do for trying to balance and play test the game, but I can't say I've ever heard of another game actually doing this. I hear of open betas, I hear of people doing stuff online, digital versions of the game for people to play, but I haven't heard of someone just having the computer play itself a whole bunch. Was this an original idea? Did you hear about this from somewhere else?
Brett 10:11
I didn't hear about it from someone else; it just came from my programming background. And I don't think it was something we set out to do from the beginning. It was just, as the design was coming together, I was like, I want to make sure this is balanced, and I went, oh, I can easily do that myself and playtest it way more than asking a bunch of people to playtest it, to get those, you know, real true averages over a crazy number of plays. And I think it would be useful for other board games and stuff like that. Because I was the programmer and one of the designers, I knew the game so well that it was probably easier for me to implement. Once I explained it, I had other designers that were like, hey, can you do that for my game? And I was like, uh, you know, I could, but do I want to? So I think it would be useful, because that's one of my frustrations sometimes when playing games: you'll play it, and then someone will find a strategy, and it's like, okay, that's all you ever do and you'll just win, because it's unbalanced through that one thing. I've had games where I was like, oh, I figured it out, and then there was no fun for the other people, because someone was just blowing everyone else away because they played a certain way. And that's something we really wanted: to have Holotype be super balanced. And I think a lot of the feedback we've gotten has been that, yeah, it's incredibly balanced, especially point-wise at the end. There's been plenty of times where people have, you know, had to go to the tiebreaker because they were right in the same area point-wise and stuff.
Brian 11:29
So we talk about a game being solved right, where there's always an obvious choice to make. It sounds like you were working to keep the game from being solved.
Brett 11:37
Yeah. And there's an interesting point there that goes to designing the AIs. When I made the AI to play the game, it had to play the way a human would play it, because sometimes gamers aren't optimal. I didn't want the AI to figure out, like, this is the best way to play this game and optimize it for something players never do. So it was definitely more of, like, okay, this is the way players play; it's weighted to try different strategies and stuff, but it was very much informed by how people would play the board game, if that makes sense.
Jason 12:08
What does that look like? Because that was the thing I was really curious about: how did you program an AI to essentially fill the role of a human? Because when I'm playing this game, I'm looking at my cards, I'm looking at my personal goal, I'm looking at the global objectives, I'm looking at other people's stuff. Obviously, a computer can keep track of that pretty easily. But then integrating that and making the decision of what to do next; even I don't know how I make that decision sometimes. So how did you represent that for an AI to make that decision?
Brett 12:34
It's represented basically the way you would as a player. So you're looking at your cards: I have a, you know, theropod from the Triassic, I have this from the Jurassic, I have this many resources of that, and I have a personal objective that I'm going for. Those actions are all given weights based on what's in your hand and what resources you have, even to the point of, like, okay, I have these two Jurassic cards, but this one's worth more, but it also costs less in resources. So all those things are weighted, and then it's basically making a decision on which one of those actions has the highest value, which eventually really just relates to which of these actions is going to get me the most points at the end of the game. And then, if I want to test a different strategy, basically you're just changing those weights, like, this AI is going to want to publish globals more than the other ones do, that kind of thing. And then at the end: which AI wins more of the time, which actions, that kind of thing. And that leads to just more of the balancing.
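Here's a small sketch of what weighted action selection along those lines could look like. The action fields, the scoring formula, and the example strategy profiles are assumptions made up for illustration; the real game uses its own weights and action types.

```python
import random

def score_action(action, hand, resources, profile):
    # Rough value of an action in end-game points (all fields assumed):
    # start from its expected points, penalize plays you can't afford yet,
    # then scale by this AI's preference for that action type.
    # (hand is unused in this toy version, but would matter in a real one.)
    score = action.get("expected_points", 0)
    score -= 0.5 * max(0, action.get("cost", 0) - resources)
    score *= profile.get(action["type"], 1.0)
    return score

def choose_action(legal_actions, hand, resources, profile, rng=random):
    # A little noise so repeated simulations explore different lines of play.
    scored = [(score_action(a, hand, resources, profile) + rng.uniform(0, 0.1), a)
              for a in legal_actions]
    return max(scored, key=lambda pair: pair[0])[1]

# Example strategy profiles: tweak the weights to make an AI that chases
# global objectives versus one that mostly digs for specimens.
GLOBAL_CHASER = {"publish_global": 1.5, "excavate": 0.9}
DIGGER = {"publish_global": 0.7, "excavate": 1.3}

if __name__ == "__main__":
    actions = [{"type": "excavate", "expected_points": 2, "cost": 1},
               {"type": "publish_global", "expected_points": 4, "cost": 3}]
    print(choose_action(actions, hand=[], resources=2, profile=GLOBAL_CHASER))
```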
Brian 13:27
Okay, so your AIs have some biases built in, some preferences.
Brett 13:32
Preferences, and you can change those to, you know, investigate other assumptions. Like, one of the things as we were playing was that certain players were like, "I never go for globals," or "I always go for globals." So by making AIs with different preferences, you could see: is that a better strategy? Is this strategy not even viable if someone just does this? That lets you test it over thousands and thousands of plays, which you just can't quite get playing with other humans, because people have their own preferences on how they play. We wanted to make sure that if they did have those preferences, it didn't mean they always lost or always won; that it would still balance out.
Brian 14:05
So, you've got the AI Jason who always wants to play optimally, and the AI Brian who wants to collect the cool dinosaurs,
Brett 14:11
Yeah, and that was one of the AIs: it's just like, whatever card is first in my list is what I do. I call it my eight-year-old AI; she's just like, what's the first one? That's what I'm going for.
Jason 14:25
Did you try to make it so that that AI could occasionally win? Or was that sort of, like, the control AI, where you have to beat this one to be considered a decent AI?
Brett 14:32
There's definitely the control of just, like, this is the simplest thing. Does that one ever win out of 50,000 playthroughs? Maybe a couple of times. But you didn't want that one to have, you know, the same odds of winning as someone who's actually, like, paying attention to what's happening and going and looking for certain things for their personal objective, that kind of thing.
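As a toy illustration of comparing a "take the first thing on the list" baseline against a more attentive policy over many simulated games, here's a sketch built around a stub game. The policies, the stub rules, and the numbers are all invented; only the idea of measuring win rates for a simple control AI versus a smarter one reflects what Brett describes.

```python
import random
from collections import Counter

def first_action_policy(actions, rng):
    # The "eight-year-old" baseline: always take the first thing on the list.
    return actions[0]

def greedy_policy(actions, rng):
    # A more attentive player: take whatever is worth the most right now.
    return max(actions, key=lambda a: a["points"] + rng.uniform(0, 0.01))

def play_stub_game(policies, rng, turns=40):
    # Stub game: each turn, every player sees the same menu of actions and
    # banks the points of the one their policy picks. Highest total wins.
    scores = Counter()
    for _ in range(turns):
        actions = [{"points": rng.randint(0, 5)} for _ in range(4)]
        for name, policy in policies.items():
            scores[name] += policy(actions, rng)["points"]
    return max(scores, key=scores.get)

def win_rates(policies, num_games=10_000, seed=1):
    # Brett's runs were on the order of 50,000 games; 10,000 keeps this quick.
    rng = random.Random(seed)
    wins = Counter(play_stub_game(policies, rng) for _ in range(num_games))
    return {name: wins[name] / num_games for name in policies}

if __name__ == "__main__":
    print(win_rates({"eight_year_old": first_action_policy,
                     "greedy": greedy_policy}))
```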
Jason 14:50
And then, as you were using this to polish everything, what aspects of the game did you try to tweak? I mean, I was looking over the box and, like, I can think of dozens of things you could try: places on the board, the number of things on each face of the dice, how long it takes to get a graduate student, the point values on the cards (obviously, you already mentioned those). Did you make a list of, like, here's everything in the game that can be tweaked, I'm now going to systematically test all of these? Or were there some where it's like, okay, I need to tweak these, and these others are not so important? Like, how did you decide what to check?
Brett 15:20
So there are certain parts of the game design that, once you've decided them, influence everything else, and that comes down to, like, what is an action worth in an action-economy game, and what is a resource worth? Before any of this simulation in Python, all that stuff was in Excel, so it was like, okay, if this is how many pips I have on a, you know, purple die, what's the distribution and the probabilities that all these things happen, so that I know when you roll this many dice, this is the average number of pips you'll get of this type of resource. Those things kind of had to be the baseline, and you don't mess with those once you get into the simulation, because that was just core mechanics. What the simulation was really determining is: what are the values on the cards, and how does that turn into victory points? So it was all the values on the cards, which comes from the point values on the specimen cards, and the point spreads on the personal objectives and the global objectives. Those were the main things: balance the point values for all that stuff using these simulations. That was the big one. And then some of the stuff that got changed: personal objectives definitely got changed, because based on the distribution of those specimens in the deck, how often would this person win with this personal objective? How much did that skew things? Same thing with global objectives: how often did the game end early because those global objective cards got filled? So those values got the major changes from the simulation, as well as things like, now we know you only need 35 Triassic cubes, because out of 50,000 playthroughs those only ran out 50 times, that kind of thing.
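Those dice baselines are easy to reproduce in a few lines of Python; here's a sketch of computing expected pips per die and the distribution of totals when rolling a handful of them. The face layouts below are made up for illustration and are not the actual Holotype dice.

```python
from itertools import product
from statistics import mean

# Illustrative die faces (NOT the real Holotype dice): each entry is the number
# of resource pips on one face. Triassic is stingiest, Cretaceous most generous,
# mirroring the direction Brett describes.
DICE = {
    "triassic":   (0, 0, 1, 1, 2, 2),
    "jurassic":   (0, 1, 1, 1, 2, 3),
    "cretaceous": (1, 1, 2, 2, 2, 2),
}

def expected_pips(faces):
    return mean(faces)

def roll_distribution(faces, num_dice):
    # Probability of each total when rolling num_dice identical dice.
    counts = {}
    for combo in product(faces, repeat=num_dice):
        total = sum(combo)
        counts[total] = counts.get(total, 0) + 1
    n = len(faces) ** num_dice
    return {total: count / n for total, count in sorted(counts.items())}

if __name__ == "__main__":
    for name, faces in DICE.items():
        print(name, expected_pips(faces), roll_distribution(faces, 2))
```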
Jason 16:54
All right. And so, looking at what you just said: the global objectives, personal objectives, specimen cards, those are the ones where players have a choice. Basically, the excavation cards, the dice rolls, those things are outside of player control. So it sounds like you set those as just aspects of the game you have to deal with, and it's the ones where players can choose where you were fiddling with things. Is that right?
Brett 17:15
Yeah, that's correct. Now, some of the, like, generation cards, like the field expeditions, those values could be tweaked too to test that kind of stuff. But those things were typically set pretty early on, like the number of cards of each type in the deck. That kind of stuff was kind of hard-coded: it was going to be 10 Triassic, 20 Jurassic, 30 Cretaceous. That went back to early design, because there just weren't many dinosaur types in the Triassic; then there started being more in the Jurassic, and even more in the Cretaceous. So that became a constraint, like, we can't have more than 10 Triassic dinosaur cards because there weren't enough specimens that would be easy for us to put into the game.
Brian 17:49
Now I regret that I overlooked that. I really wish I'd noticed that specific pattern and pointed it out, but I guess our listeners have found out about it now, so no worries.
Brett 17:57
Yeah, and if you notice the pips on the dice: the purple Triassic dice only have six, the blue Jurassic dice only have eight, and the green Cretaceous dice have ten.
Brian 18:08
The lure of the Triassic was just too strong for me. I knew it was more valuable, and I kept chasing it and would lose. So I was one of those bad AIs,
Brett 18:16
and sometimes it worked out. You never know,
Jason 18:18
Yeah, but it was funny watching him get so frustrated as the paleontologists we were playing with would just keep tossing those Triassic specimens to go after the cool dinosaurs they really like.
Brett 18:28
There's the personal preference AI, the one that only wants to publish Triceratops. Yeah, we had one of those.
Jason 18:35
Also, here's a question that, with all of your hundreds of thousands or millions of playthroughs, you might be able to answer. Every time Brian and I play this game, we have the question: how long does a game last? Like, how many actions does each player get before the game ends? Because we're kind of thinking of the metaphor of, okay, this is the paleontologist's career; is each one of these actions a month, a semester, a year? It means nothing for the game, but we're curious. So how long does a game usually last? How many actions does each person get?
Brian 19:05
Good question
Brett 19:05
That's interesting. Roughly about 40 actions during a game, and it's going to change depending on the variant, right? You can play the longer or shorter game by using the different player tracks. There have been times where, you know, I'll play a two-player game with the five-player board just because I want to publish a ton. But the average number of publishes for each player is around seven to eight by the time you're done. I'm sure a paleontologist could answer this best, but I expect that maybe that's a decade or more of research if you're publishing, you know, eight times, because when you're going out in the field and discovering stuff, it definitely takes a while. But then if you go way back to the Bone Wars era, Marsh and Cope, they were pumping out new ones like, you know...
Brian 19:41
that makes me think that the best analogy to overlay on a paleontologist's career for 40 rounds is 40 years, because I think there are plenty of paleontologists who are still working into their 70s.
Jason 19:51
yeah, but you hope that they published more than eight times in those forty years
Brian 19:54
You would hope. Again, no simulation is perfect, right?
Jason 19:57
Yeah, although you mentioned the Bone Wars. That was the number one request from our paleontologist friends, is that they want the Bone Wars expansion to the game.
Brian 20:05
Yeah, I want the Bone Wars expansion. I want the traitor mechanics.
Brett 20:08
Yeah, that's definitely been asked for, for sure, where it's like, I'm going to steal your fossils, or I'm going to, you know, denounce your publication because it's wrong, and now mine becomes the better one.
Brian 20:20
Yeah, you need a fourth type of meeple, the, I don't know,
Jason 20:23
the saboteur? the infiltrator?
Brian 20:25
The saboteur is probably the best way to put it.
Brett 20:27
I like it
Jason 20:28
Someone with, like, a cloak, a fedora, and a little spy mask.
Brian 20:32
Since they have to be indicated by the type of hat.
Jason 20:36
So Brian, do you have any last questions you want to ask Brett?
Brian 20:39
I did. So one thing I was curious about: a lot of board games now, Wingspan, Evolution, and a lot of other ones, Root too, have these digital components to them. There's an app you can play Wingspan on; it's a representation of the game. Lots of games have made their way to Steam. It seems like you've done all the work on the back end to do that for Holotype. Is there any interest in pushing it to an app?
Brett 20:58
Yeah, I guess so. That just seems like more programming work for me, which I enjoy doing, but I don't have a lot of experience with app stores and doing stuff like that. Mine's more on just the back end, spitting out all the stuff. But I know there are, you know, companies out there that will take those games and make them look beautiful and nice and playable on an app store. So we'd be more than interested in that.
Brian 21:20
I had one other question, and you don't have to answer, but I'm curious, if you want to give a look behind the curtain, what was the mechanic you guys tossed?
Brett 21:27
Oh, this is a great one. It was a horrible one; that's why it got tossed. So, all the milestones that you get during the game, adding, you know, extra research or something else every time you hit one of those green stars, you add a milestone to your board to get a little better at something. Now every player has access to the same set of those; in the original game, they didn't.
Brian 21:49
So it was zero-sum?
Brett 21:50
Yeah. There were the number of players minus one available. Oh, and our brilliant catch-up mechanic was: okay, when you hit those green stars, everybody adds up their current points, and whoever's in last place gets to choose first.
Brian 22:06
So that's a little bit too close to real life.
Brett 22:09
Yeah. And the problem is, while that was cool in the sense of, oh, it's a catch-up mechanic, someone could snag something from someone else, and the leader would get last pick and usually get something, you know, less good, it made the games take a ridiculous amount of time longer, because that whole scoring process you would have to do four times throughout the game. Then you'd have the picking, and someone waiting, being like, do I get this? Do I get this? That indecision. And we wanted a really fast, snappy-playing game, so that one had to get thrown out. And I think it got thrown out for the better, because now, with everybody having their own set, it's easy: you can make your choice on your own time, and it's a little more fair. Everybody's got the same thing; you can't complain that someone else snagged this before you.
Brian 22:47
I'd be really sad if I didn't get the storage closet for the extra fossils.
Jason 22:51
All of us always chose the storage closet first because we're hoarders.
Brett 22:56
That's interesting. Yeah, that's super interesting, because that's one of the ones where it's very subjective to me whether I choose it or not, like, how many fossils are currently flooding my storage?
Jason 23:04
Was that digitally playtested? Did you change the AI's preference for which one of those it would go for first?
Brett 23:10
In a couple of cases, yeah. Because there are only six in the base game, that was a fairly easy thing: I'm going to get four of them throughout the game, so do I just randomly pick them, or is this one always picked first, and does that, you know, skew it too much? So that wasn't too hard to test. And it's funny trying to make those, you know, game things relate to the field of paleontology, because, like, I spent a lot of time making sure all the global objectives had meaningful names within paleontology, and people that play the game who aren't paleontologists just don't care. They're like, "Systematics? What is that? I don't care."
We have, like, systematics, cladistics, and phylogeny, and one of the paleontologists broke down the differences between them, and at a certain point I was just like, yeah, okay, sure.
We wanted to be true to it, so that someone who was really into it would be like, oh, I really appreciate that. Same thing with, like, the colors of the dice being related to the international stratigraphy chart, that kind of stuff. It's an unnecessary step, but people sure appreciate it when they learn about it or know about it.
Brian 24:05
We were trying to figure out why only one person can publish per turn.
Brett 24:09
That's purely game mechanics. We wanted the one spot that was going to force a lot of, you know, bumping mechanic going on.
Brian 24:15
We retconned that there was a departmental copy editor or the press release person. They only had so much time.
Brett 24:20
I like it. Yeah, there are certain things where it's like, when we played with the paleontologists, we realized research is not as easy as just going to the university library. Please take that as a, you know, conceit to game mechanics.
Jason 24:33
One last question. This is the million dollar question, what is your favorite dinosaur in the game?
Brett 24:40
Oh my gosh. Well, my favorite dinosaur would be Stegosaurus. My favorite creature from the Mesozoic is a marine reptile, the mosasaur.
Brian 24:50
anyone in particular?
Brett 24:51
Oh my gosh, probably a Tylosaurus.
Brian 24:53
I don't know if I have a favorite. I was a sucker for ichthyosaurs personally, but my favorite dinosaur is Parasaurolophus.
Brett 24:59
Those are pretty awesome. I'll have to send you one of our Parasaurolophus pins.
Brian 25:03
That would be, yes, you can do that. I will allow it.
Jason 25:08
Oh, yeah, just twist his arm there.
Brian 25:10
Yeah, we'll have to send you stuff too, of course. Thank you! Awesome, we'll do an exchange.
Jason 25:15
All right, that's a good place to wrap it up. Brett, thanks so much for coming on.
Brett 25:19
My pleasure.
Jason 25:20
We've already told people a lot about Holotype, but if people want to look you up, or beyond advertising your game, is there anything else you want to talk about? Other stuff that may be in the works from Brexwerx, or other things you want to mention?
Brett 25:30
Yeah, we've got some stuff in the works, but you can always check us out at brexwerxgames.com. We just released our 8-legged Peacock, which is a game about the Maratus peacock spiders from Australia and their mating dance.
Brian 25:41
We saw that. I'm excited about that. That might be something we put into one of our light games...
Brett 25:47
light science game.
Jason 25:48
Well, thank you so much. This has been great, and it was really nice to have these questions answered. I was telling Brett beforehand, I was thinking, man, I wish I could just know the reasoning behind this, and then, wait a second, I run a podcast. I can just ask him and get him on the air; that's content.
Brian 26:06
Thanks for being our first creator willing to come on and talk to us about their game.
Brett 26:09
No problem, anytime.
Jason 26:11
All right. Thank you so much, Brett. Good luck with everything. And I guess, you know, if your current job doesn't work out, you can always start up a new job as a board game designer and playtester, because you apparently have a skill set in short supply.
Brian 26:24
And if there's anything new from Brexwerx, send us an email. We'll make sure that we promote it for you.
Brett 26:29
Sounds good. Thank you so much, guys.
Jason 26:31
And for our listeners, thank you for tuning in. We hope you enjoyed this first interview we've done, and until next time, have a good week and happy gaming.
Brian 26:37
Have fun playing dice with the universe. See ya.
Jason 26:41
This has been the Gaming with Science Podcast, copyright 2025. Listeners are free to reuse this recording for any non-commercial purpose, as long as credit is given to Gaming with Science. This podcast is produced with support from the University of Georgia. All opinions are those of the hosts and do not imply endorsement by the sponsors. If you wish to purchase any of the games we talked about, we encourage you to do so through your friendly local game store. Thank you, and have fun playing dice with the universe.