They keep killing each other…

I rewrote a bunch of the patrol bot AI to use EQS (the Environment Query System: see https://dev.epicgames.com/documentation/en-us/unreal-engine/environment-query-system-quick-start-in-unreal-engine#2-environmentquerycontext ), which should greatly simplify the bots and give them better criteria for finding spots that can see the player but are also kinda close.

Here’s the EQS Tree:

Essentially, EQS is a way to break up the navigation grid into nodes; each node gets a score, and the EQS tree determines that score (lower is better in my setup). In this case I have two tests for each node: can the node see the player’s current location, and is it close to the player? Here’s another video of things kinda working but not fully:
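To make the scoring concrete, here’s a rough Python sketch of the idea (purely illustrative: the real thing is generators and tests inside the EQS asset, and the weight/penalty numbers here are made up):

```python
import math

def score_node(node_pos, player_pos, can_see_player,
               distance_weight=1.0, no_sight_penalty=10000.0):
    """Lower is better: nearby nodes that can see the player win."""
    score = distance_weight * math.dist(node_pos, player_pos)
    if not can_see_player:
        score += no_sight_penalty  # effectively rules out spots with no line of sight
    return score

# A bot then heads for whichever candidate node ends up with the lowest score.
```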

I also fixed up the rocket so it won’t fly through walls anymore.

Here’s some quick gameplay of what I have atm:

It’s very silent right now… Also, the stabby bots are quick as hell and I want some kind of warning for that. The gameplay right now is very Serious Sam and I want to tone it back speed-wise. I think I might just slow down all the characters and add some head bob.

Longer gameplay (with sound) below.

The map I was playing on isn’t great, but I figure it’s better than the dev arena I had before. I’m in the process of remaking the warehouse as one big static mesh so it’s easier on lower-performance machines (right now it’s like 200+ meshes).

My 4090’s fan is starting to spin up more often, so I’ve got a feeling I might be hitting the limit of my “optimize later” strategy.

Build build build build

I made a real spawn location for the patrol bots:

There isn’t too much to it, just a bunch of boxes and a conveyor that spits out the bots.

I reworked some of the deployment logic so that bots are physics objects before they get fully deployed. That way I can do stuff like drop them off cliffs right after they’re built.

The deployment logic now hardcodes the mesh’s position and rotation before playing the deploy animation (which is why you see them jump up in the second video). A way I can fix/clean this up is to add a rollover animation that plays if the mesh lands disoriented.
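The “snap or roll over” decision could probably be as simple as checking how tilted the landed mesh is. A throwaway Python sketch of that idea (the 45-degree threshold and all the names are made up):

```python
import math

def choose_deploy_step(mesh_up, max_tilt_deg=45.0):
    """Decide whether to snap upright immediately or play a rollover animation first."""
    world_up = (0.0, 0.0, 1.0)
    dot = sum(a * b for a, b in zip(mesh_up, world_up))
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return "snap_and_deploy" if tilt <= max_tilt_deg else "rollover_then_deploy"
```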

Right now the assemblers run on timers; when I get this hooked up to the survival game mode, I’ll hand control back to the game mode so it can dump out bots as needed.
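The rough shape I’m picturing is something like this (purely illustrative Python, nothing in the project is actually named like this):

```python
class Assembler:
    """Spits out a bot either on a fixed timer or when the game mode asks for one."""

    def __init__(self, spawn_interval=None):
        self.spawn_interval = spawn_interval  # None = wait for game-mode requests
        self.time_since_spawn = 0.0

    def tick(self, dt):
        if self.spawn_interval is None:
            return  # the game mode owns the pacing
        self.time_since_spawn += dt
        if self.time_since_spawn >= self.spawn_interval:
            self.time_since_spawn = 0.0
            self.spawn_bot()

    def request_bot(self):
        # Called by the survival game mode once it takes over spawn pacing.
        self.spawn_bot()

    def spawn_bot(self):
        print("conveyor goes brrr")  # stand-in for the actual deploy logic
```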

I also added this guy in-game:

It doesn’t look great, but I want one of these at the side of each assembler, with some kind of “override” command or mini-game to stop the flow of bots or flip them to your side.

Rockets…man

Gameplay isn’t done yet, but I got some paint set up on the rocket launcher and the rocket.

I threw a bunch of random phrases + numbers on it; it still looks pretty cartoony (but so does everything atm).

Next step is to get the firing + reload sequence set up.

Misc.

Added a parameter to scale the grenade explosion radius so you can kinda see where you’ll get hit:

It’s set up via user parameters and HLSL, which I haven’t used in a while (I was fluent back in the XNA days, but that’s pushing 10+ years ago now).
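As a loose illustration of the kind of math involved, a radius-scaled falloff might look like this in Python (this is a generic sketch, not the actual HLSL, and the inner-edge softness value is arbitrary):

```python
def warning_alpha(dist_from_center, damage_radius, inner=0.9):
    """Fully opaque through most of the blast, fading to nothing at the damage radius."""
    if dist_from_center >= damage_radius:
        return 0.0
    if dist_from_center <= inner * damage_radius:
        return 1.0
    return (damage_radius - dist_from_center) / ((1.0 - inner) * damage_radius)
```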

I might dig into this more just to get my feet wet again with custom shader code. Here’s the Blueprint setup I had to do in order to get this working properly:

Nothing crazy, but I always treat actors as these sacred classes that I want to minimize, so when I wanted the particle system to stay put after the grenade actor destroyed itself, I started going down a detachment rabbit hole. After some googling I realized I should stop being scared of spamming actors for whatever I need. In this case I made an actor that is JUST the grenade explosion holder.

I also started modeling this guy:

I want it to be a “large” version of the patrol bot that has rockets on it, and then I want to get a laser set up coming out of the front. I’m moving towards the idea that each patrol bot has to reload after ANY burst; otherwise the game will probably be way too hard.

I’ll probably push more on this Friday/Saturday/Sunday to get the new bot in place. I also want to retool the flashbang grenade bots so each one holds the flashbang in its “hand” as it goes off, then reloads. No real reason other than I think that will be sillier and distinguish them from the frag grenade bots.

Still need to make more maps, still need to make a higher-level metagame, still need to add more audio. Uhhhggg, I should probably make a Trello board…

UE Control Rigs and Custom Numpad Setup

I kept looking at the last video I posted and got really annoyed at how bad the patrol bot’s animations looked. This drove me to remake the animation rig for the patrol bot (to be much, much simpler), and I started making a control rig in Unreal to push the animations there (see https://dev.epicgames.com/documentation/en-us/unreal-engine/how-to-create-control-rigs-in-unreal-engine).

Now, why use control rigs? Honestly, not too sure yet… My hope is that IK gets easier (so I don’t have to add IK bones), but adding these controls seems tedious, and on top of that your forward solve block apparently has to be a big tower of Blueprint power?

I’m probably missing the plot here, but my “forward solve” also seems to be undoing my “backwards solve”, where I had assumed the “backwards solve” was used for IK situations. I’m still playing around, but the hope is that by leaning on Unreal’s internals I’ll be able to handle the physicality of the bots a bit better than I otherwise could.

Also, I had a GMMK numpad (https://www.gloriousgaming.com/products/gmmk-numpad-keyboard?srsltid=AfmBOoqt9KEojE6tva-cmlDcTDtw1XiBNFEktoFWoobeNvWKYGD8ZtL0 ; I got it on sale for $90, but I think it’s still crazy overpriced) which I wanted to use for Ableton and never got working (the slider and the knob weren’t configurable as MIDI in). So I installed a VIA flavor of QMK (https://caniusevia.com/), which lets me configure the keyboard in the browser.

Which would be amazing if 1.) Firefox supported USB HID and 2.) I could remap the slider. But right now it’s really good for controlling OBS recordings rather than using the Snipping Tool, which records at like 20fps.

So, for example, here’s the control rig I described in action, BUT there’s no OBS window visible!

Otherwise, I think I’m still dead-set on remaking the patrol bot animations in Unreal. Walking and reload might be the most annoying, but mostly I want to be able to make quick animations without the huge headache of .fbx exporting and importing (even with the Blender-to-Unreal addons https://www.unrealengine.com/en-US/blog/download-our-new-blender-addons , which work amazingly well for static meshes but are kinda sketchy for skeletal ones). I kinda wish Unreal had the option of slicing the animation list of an FBX and attempting bone resolution before importing. I really want to get this working because then my animating workflow stays in Unreal. Blender still blows Unreal out of the water for making meshes IMO, but animations in Blender still seem hacky with the Action Editor.

There are a few things I’m actively annoyed with when it comes to control rigs (which I won’t show here because this one is still WIP). I’m also a straight-up control rig novice, so I bet that as I learn, these problems will turn out to be solvable with better practices.

1.) You can’t manipulate the control shapes in the editor preview window. Seems like that would be an easy addition and could match the same kind of workflow as the “hold control to bump the physics asset” thing.

2.) Controls are affected by bones. I get WHY on this one, but it seems counter-intuitive that you would ever want to make a rig that is controlled by a parent bone. I get the idea of attaching to a bone in order to have a small sub-object (for example, a turret on a large ship).

3.) When you add a control to a bone, it gets added as a child of that bone. This would be fine if #2 weren’t a thing.

4.) Adding default forward solve assignments is not automated. I bet I could find a Python script to do this, but still, that Blueprint tower of power really could and should be generated for you when you create a new control for a bone.

Still gonna push ahead with the control rigs though… Modular rigs freak me out (https://dev.epicgames.com/community/learning/tutorials/Dd31/unreal-engine-modular-control-rig-rigging-with-modules) and seem to be used primarily for bipedal skeletons.

Some Texture Painting / Raspberry Pi fun

I attempted to add some color to the zapper I showed earlier. I’m not 100% enthusiastic about the job I did, but I still like some of the ideas here.

For the record, this is how you set up a shader in Blender for texture painting:

Essentially, you make a shader you like as your “base”, then create an image that is entirely zero alpha. You then tie the texture’s alpha into a color Mix node, so that when you paint onto the texture, the painted image data gets swapped into the shader. If I wanted to get REALLY creative here, I would add something like a chipping algorithm based on the tangents of the base model so everything gets a “worn” look.
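If you’d rather script it than click through the node editor, here’s roughly what that setup looks like in bpy (the material name, image name, and resolution are placeholders, and I’m using the old MixRGB node for its readable socket names):

```python
import bpy

mat = bpy.data.materials["ZapperBase"]      # placeholder: your existing "base" material
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

bsdf = next(n for n in nodes if n.type == 'BSDF_PRINCIPLED')
base_color = tuple(bsdf.inputs['Base Color'].default_value)

# Zero-alpha image to paint into.
img = bpy.data.images.new("ZapperPaint", 2048, 2048, alpha=True)
img.generated_color = (0.0, 0.0, 0.0, 0.0)

tex = nodes.new('ShaderNodeTexImage')
tex.image = img
nodes.active = tex                          # so Texture Paint mode targets this image

mix = nodes.new('ShaderNodeMixRGB')
mix.inputs['Color1'].default_value = base_color        # the untouched "base" look
links.new(tex.outputs['Color'], mix.inputs['Color2'])  # the painted color
links.new(tex.outputs['Alpha'], mix.inputs['Fac'])     # paint alpha decides which shows
links.new(mix.outputs['Color'], bsdf.inputs['Base Color'])
```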

The potato cannon button came out fine, I think; the wires could have used a bit more slack (or maybe some staples holding them down).

On the front of the cannon I tried adding some scorching, but honestly I botched that portion, so it looks more like someone dipped the front in soot and smeared it back.

The bat handle is a bit too cartoony. The wrap needed to be tighter, but I had already applied the Screw modifier to the object, so I was stuck with this. I’ll probably remake it if I keep the same idea.

I’m weirdly happy with the shoulder-brace bike tire (minus the un-beveled edges); making a tire is surprisingly difficult in Blender (for me at least).

The pylons in the back look kinda cool, I think, but they seem crazy out of place next to the rest of the weapon. They don’t have that “junkyard” kind of look I was going for (also, without lighting it’s hard to see the green emission).

The can and the junction box are fine; I’d want to add a label to the can and some screws to the junction box.

In other news, I spent 3 hours debugging my ASUS USB-BT500 (https://www.asus.com/us/networking-iot-servers/adapters/all-series/usb-bt500/) on a Raspberry Pi 5 so I could get my Xbox controller hooked up to run Steam Link (which was recently released for the Raspberry Pi 5’s architecture: https://help.steampowered.com/en/faqs/view/6424-467A-31D9-C6CB). I’m using this kit, which I got on sale at Micro Center: https://www.amazon.com/CanaKit-Raspberry-Starter-Kit-PRO/dp/B0CRSNCJ6Y/ref=asc_df_B0CRSNCJ6Y?mcid=499475e052c83be5a802a944f85cf088&tag=hyprod-20&linkCode=df0&hvadid=693601922380&hvpos=&hvnetw=g&hvrand=8182359702763456621&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9002000&hvtargid=pla-2281722246870&psc=1 . My thought was that getting a fan would be better for long sessions of video decoding on Steam Link.

I only wanted a Bluetooth adapter because my Xbox controller had crazy delay with the Raspberry Pi 5’s integrated Bluetooth, and I only bought a BT500 because it was at Micro Center and it was kinda cheap. Turns out the Realtek chip inside the BT500 isn’t natively supported by Raspbian (or Linux, really). After debugging for like 3-4 hours, I had a thought that maybe the CanaKit fan was blocking the Bluetooth signal, so I removed the fan, tried the native Bluetooth on the Raspberry Pi, and everything worked perfectly.

tl;dr: I spent extra money to give myself more problems.

Zapperz

Not sure what this is, but I think I might keep rolling with it and texture it to see where it can fit into future projects.

Other angles:

The pole out the side is supposed to be the end of a baseball bat, and the curve underneath I wanted to be some kind of road tire. The sights are a soup can with a nail poking out underneath.

…What is it?

I made this thing:

I don’t know what it is, but it’s neat.

The column + coil is built around taking the animation frame counter, doing some math to turn it into a sine wave, and using that as a blend factor between a few colors. Here’s a better zoom.
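The frame-counter trick is basically one driver expression. Something like this in bpy (the material and node names here are made up):

```python
import bpy

mat = bpy.data.materials["CoilGlow"]   # placeholder material name
mix = mat.node_tree.nodes["Mix"]       # the Mix node blending the two colors

# Drive the mix factor from the current frame: a 0..1 sine wave, ~63-frame period.
fcurve = mix.inputs['Fac'].driver_add('default_value')
fcurve.driver.expression = "0.5 + 0.5 * sin(frame / 10)"
```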

Not much more to show of this. The base is too shiny IMO, but I was surprised how easy it was to make dynamic textures in Blender. The volume aspect I hooked up just to see what it would look like, and I dug it. In the future I should learn a bit more about how that plays into things, but for now I’m happy with my cheap smoke.

Tree part 3

I recorded myself modeling another tree. Then I realized I probably shouldn’t be posting videos with copyrighted music, so I made a song to go along with the tree recording (I used some of the pre-canned Ableton clips for bass and vocals, which feels like cheating, but I wanted something kinda alright and fast; the drums I did myself).

There’s one branch that looks weird to me, but otherwise I think it came out alright.

This was also my first time using Kdenlive (https://kdenlive.org/en/), which is surprisingly good for an open-source video editor. The last time I edited video heavily was probably in high school with a trial copy of Sony Vegas and Windows Movie Maker.

Highly recommend it if you want basic timelines + audio editing; if you’re going crazy with effects, it might be better to stick with Adobe’s software.

Now I can probably go make small variations of the same tree 3-4 more times to get a good low-poly forest going.