The red line is a line trace I perform upwards to find any actors that are willing to take the lightbulb:
So I’ll have to essentially make a bone heavy mesh and attach that to the lightbulb.
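The logic of the trace itself amounts to something like this hypothetical Rust sketch (nothing engine-specific, names made up):

// Hypothetical sketch of the pickup trace: cast a ray straight up and hand the
// lightbulb to the nearest actor along it that will accept it.
struct Actor {
    name: &'static str,
    height: f32,       // where the actor sits along the upward trace (made up)
    takes_bulbs: bool, // "willing to take the lightbulb"
}

fn trace_up_for_taker(actors: &[Actor], max_dist: f32) -> Option<&Actor> {
    actors
        .iter()
        .filter(|a| a.height > 0.0 && a.height <= max_dist && a.takes_bulbs)
        .min_by(|a, b| a.height.partial_cmp(&b.height).unwrap()) // nearest hit wins
}

fn main() {
    let scene = [
        Actor { name: "ceiling fan", height: 2.0, takes_bulbs: false },
        Actor { name: "light socket", height: 3.0, takes_bulbs: true },
    ];
    match trace_up_for_taker(&scene, 10.0) {
        Some(a) => println!("attach lightbulb to {}", a.name),
        None => println!("nobody wants this lightbulb"),
    }
}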
Also the first level is progressing:
Slow pace, but I think I should have something within a week. Going with the desert motif so I don’t have to make too many buildings.
Also I made an alert sound for the bots:
I made the sound in Ableton but I didn’t save the set…. Essentially it was an up-only arpeggiator into an Operator preset, into a Redux, into a Compressor and EQ Eight. Also “arpeggiator,” I found out, isn’t in the dictionary:
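If the dictionary is no help: an up-only arpeggiator just cycles through whatever notes are held, lowest to highest, one per step. A toy Rust sketch of the idea (not Ableton’s actual device):

fn main() {
    // Hypothetical held chord as MIDI note numbers (E4, C4, G4).
    let mut held: Vec<u8> = vec![64, 60, 67];
    held.sort(); // "up only" = always ascending pitch order
    for step in 0..8 {
        let note = held[step % held.len()];
        println!("step {step}: play MIDI note {note}");
    }
}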
Fasulo, G.; Federico, L.; Sollazzo, A.; De Vivo, L.; Citarella, R. Hybrid Rocket Engine Noise: Measurements and Predictions of Acoustic Environments from Horizontal Static Fire. Appl. Sci. 2023, 13, 9041. https://doi.org/10.3390/app13159041
I did this by making some white noise generators in Ableton and filtering everything down using EQ Eight:
(Screenshot annotations: bassy noise, a higher-pitched hiss, an over-saturated mid region.)
Not a perfect re-creation (I did not read the paper, and I also didn’t compensate for the A-weighting, which I think they would apply here?) but close enough.
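The same trick works outside Ableton too. A hypothetical Rust sketch that splits white noise into those three regions with eyeballed one-pole filters (the coefficients are guesses, nothing derived from the paper):

fn main() {
    // xorshift32 PRNG state for the noise source; any nonzero seed works
    let mut state: u32 = 0x1234_5678;
    // one-pole filter states
    let (mut bass_lp, mut fast_lp, mut slow_lp) = (0.0_f64, 0.0_f64, 0.0_f64);
    let out: Vec<f64> = (0..48_000)
        .map(|_| {
            // white noise in [-1, 1]
            state ^= state << 13;
            state ^= state >> 17;
            state ^= state << 5;
            let white = (state as f64 / u32::MAX as f64) * 2.0 - 1.0;

            // bassy noise: heavy one-pole low-pass
            bass_lp += 0.02 * (white - bass_lp);
            // crude mid band: difference of two low-passes, then tanh as a
            // stand-in for the "over saturated" region
            fast_lp += 0.30 * (white - fast_lp);
            slow_lp += 0.05 * (white - slow_lp);
            let mid = ((fast_lp - slow_lp) * 4.0).tanh();
            // hiss: whatever the fast low-pass throws away
            let hiss = white - fast_lp;

            0.6 * bass_lp + 0.3 * mid + 0.1 * hiss
        })
        .collect();
    println!("made {} samples; first = {:+.3}", out.len(), out[0]);
}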
The laser and rocket tail are just Niagara systems; the laser has an input that is updated every tick from the rocket bot.
I kinda want the bot to aim itself at the player, that way the laser gets shined in their eyes. But maybe next time around? I really gotta work on the base map for v0.0.1 (or was it v0.0.2?)
I wanted cars in the map so I did a 5 minute car….yeah…
(Also the hud is still crap and rockets fly through walls….)
Right now I have the detection radius at like 1m, so that’s why I have to rub against the robot to get it standing up. This required some mods to the AI tree:
You can see that the “start Engage” block is now orphaned off the tree. The original blueprint essentially just found the player no matter the situation and copied it into the “target” field on the blackboard:
By removing that, I pushed the selection of the target down to the pawn itself. Now the general logic is “I SEE A PLAYER! ATTACK!” rather than being handed a player by the engine.
Unreal’s AI Perception is one of those things that are just kinda “done” in the engine and it works the way you expect.
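None of this exists as textual code on my end (it’s all Blueprint), but the shape of the new logic is simple enough to sketch in Rust with made-up names:

// Not Unreal code -- just the shape of the logic. Before: the behavior tree
// always wrote the player into the blackboard. After: the pawn's perception
// callback decides who the target is, and the tree only reads it.
#[derive(Clone, Debug, PartialEq)]
struct ActorId(u32);

#[derive(Default)]
struct Blackboard {
    target: Option<ActorId>, // the "target" field the tree reads
}

struct PatrolBot {
    blackboard: Blackboard,
}

impl PatrolBot {
    // Fires when perception gains or loses sight of an actor.
    fn on_perception_updated(&mut self, actor: ActorId, sensed: bool, is_player: bool) {
        if sensed && is_player {
            // I SEE A PLAYER! ATTACK!
            self.blackboard.target = Some(actor);
        } else if self.blackboard.target.as_ref() == Some(&actor) {
            self.blackboard.target = None; // lost them, go back to patrolling
        }
    }

    fn tick_behavior_tree(&self) {
        match &self.blackboard.target {
            Some(t) => println!("engage {:?}", t),
            None => println!("patrol"),
        }
    }
}

fn main() {
    let mut bot = PatrolBot { blackboard: Blackboard::default() };
    bot.tick_behavior_tree();                          // patrol
    bot.on_perception_updated(ActorId(7), true, true); // player enters the 1m radius
    bot.tick_behavior_tree();                          // engage ActorId(7)
}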
The goal is that I’ll have some kind of assembly line that will dump out “undeployed” robots; then they’ll get deployed and try to find you in the map. This also lets me make lying-in-wait kinda situations with the bots. And if I’m aiming for a Killing Floor style of wave survival, I want explicit building spawners that play an animation and dump out robots.
I have the patrol bots hooked up to throw them based upon an enumeration (which I don’t like, I’d rather a subclass, but I shot myself in the foot earlier on).
I just hijacked the reload animation and basically said “if you’re a grenade bot, reload, and instead of throwing a mag up, throw a flash-bang”. Pretty stupid for now but it’s a good proof of concept; the shakiness is because I half-implemented physical animations to handle bot hits. I’m still not happy about that, I want to move to a fully Unreal animation setup because my animations are so simplistic, but that’s another avenue I gotta go learn.
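For what it’s worth, the enum-vs-subclass gripe isn’t Unreal-specific; here’s roughly the tradeoff sketched in Rust (hypothetical names, obviously):

// The enum version forces every bot behavior through one match:
enum BotKind {
    Patrol,
    Grenade,
}

fn on_reload(kind: &BotKind) {
    match kind {
        BotKind::Patrol => println!("throw a mag up"),
        BotKind::Grenade => println!("throw a flash-bang"), // hijacked reload
    }
}

// The "subclass" version keeps each bot's behavior with the bot:
trait Bot {
    fn on_reload(&self);
}

struct PatrolBot;
impl Bot for PatrolBot {
    fn on_reload(&self) {
        println!("throw a mag up");
    }
}

struct GrenadeBot;
impl Bot for GrenadeBot {
    fn on_reload(&self) {
        println!("throw a flash-bang");
    }
}

fn main() {
    on_reload(&BotKind::Grenade);
    let bots: Vec<Box<dyn Bot>> = vec![Box::new(PatrolBot), Box::new(GrenadeBot)];
    for b in &bots {
        b.on_reload();
    }
}

The match in the enum version is exactly the “if you’re a grenade bot” branch from above; the trait version keeps each bot’s reload behavior with the bot.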
Wanted to make a 200-ish bpm beat but I got bored and glitched them the hell out.
You should NOT do this for any reason; normally it makes a horrid, clippy sound. In this case I compressed and filtered it so it wasn’t that bad.
The synth-y part is an Ableton Meld preset with an up arpeggiator at the front.
I might re-make this guy with the same synth but better, less-bugged-out drums, but I kinda wanna make the tree model break apart the way I want in the game I’m making…. I’ll play it by ear based on what I’m feeling tomorrow.
I recorded myself modeling another tree. Then I realized I probably shouldn’t be posting videos with copyrighted music, so I made a song to go along with the tree recording (I used some of the pre-canned Ableton clips for bass and vocals, which feels like cheating, but I wanted something kinda alright and fast; the drums I did myself.)
There’s one branch that looks weird to me but otherwise I think it came out alright.
This also was my first time using KdenLive (https://kdenlive.org/en/), which is surprisingly good for an open source video editor. The last time I was editing video heavily was probably in high school with a trial copy of Sony Vegas and Windows Movie Maker.
Highly recommend it if you want basic timelines + audio editing; if you’re going crazy with effects, it might be better to stick to Adobe’s software.
Now I can probably go make small variations of the same tree 3-4 more times to get a good low-poly forest going.
My friend is working on some stuff in Rust and it’s popped up at my job a few times, so I wrote an ASIO sine-wave generator in Rust to familiarize myself.
FYI the code below is not debugged/cleaned up and definitely has stuff that can be taken out.
use cpal::{
    traits::{DeviceTrait, HostTrait, StreamTrait},
    OutputCallbackInfo,
};

fn main() {
    #[cfg(target_os = "windows")]
    {
        let host = cpal::host_from_id(cpal::HostId::Asio).expect("failed to initialise ASIO host");

        println!("Searching for ASIO Devices...");
        let devices = host.output_devices().expect("No devices found");
        for (counter, device_from_list) in devices.enumerate() {
            let name: String = device_from_list
                .name()
                .expect("YOU LITERALLY JUST GAVE ME THIS, IF YOU FAILED THE PROGRAM SHOULD HAVE STOPPED WHY IS THIS ALLOWED");
            println!("{} : {name}", counter + 1);
        }

        // Grab my interface by name.
        let search_string = "Focusrite USB ASIO";
        let device: cpal::Device = host
            .output_devices()
            .expect("we already did this!")
            .find(|x| x.name().map(|y| y == search_string).unwrap_or(false))
            .expect("Focusrite not found in the list above");
        let device_print_name: String = device.name().expect("How would this not have a name at this point.");
        println!("Connected To {device_print_name}");
        println!("Default output config: {:?}", device.default_output_config().unwrap());

        // These could be dynamic but I'm keeping them hardcoded to my Focusrite settings.
        let my_stream_config = cpal::StreamConfig {
            channels: 2,
            sample_rate: cpal::SampleRate(192_000),
            buffer_size: cpal::BufferSize::Fixed(1024),
        };

        let err_fn = |err| eprintln!("an error occurred on stream: {}", err);
        let stream = device
            .build_output_stream(&my_stream_config, get_data, err_fn, None)
            .expect("Well I'm stumped");
        stream.play().expect("stream refused to start");

        // Keep the process alive so the sine rings for a second before the stream drops.
        std::thread::sleep(std::time::Duration::from_millis(1000));
    }
}

// Fills each output buffer with a 440 Hz sine. The static mut is a hack to keep
// phase across callbacks (uhhgg); a closure owning its own state would be cleaner.
fn get_data(output: &mut [i32], _callback_info: &OutputCallbackInfo) {
    static mut CURRENT_TIME: f64 = 0.0;
    let freq_out: f64 = 440.0;
    let amp_out: f64 = i32::MAX as f64 / 32.0;
    let sample_rate: f64 = 192_000.0;
    let freq_as_rad: f64 = 2.0 * std::f64::consts::PI * freq_out;
    unsafe {
        // Advance time once per 2-channel frame and write the same sample to both
        // channels, so left and right don't end up with alternating samples.
        for frame in output.chunks_mut(2) {
            CURRENT_TIME += 1.0 / sample_rate;
            let value = (amp_out * (freq_as_rad * CURRENT_TIME).sin()).trunc() as i32;
            for sample in frame.iter_mut() {
                *sample = value;
            }
        }
    }
}
Anyone who knows Rust probably won’t like the way I wrote this, but it does what I set out to do. Which brings me to the stuff I don’t really like about Rust on first impressions:
1.) It expects you’re going to develop multi-threaded applications: this is totally reasonable, as that is the primary draw of using Rust. However, if you’re trying to pump out an easy single-threaded connect-and-execute style of device control, it can be quite frustrating trying to understand how an API functions.
2.) A lot of error handling: I’m cool with this concept, but it’s probably going to end up with me putting .expect() or ? at the end of the majority of my function calls (see the sketch after this list).
3.) Optional type annotations: Rust is statically typed, but inference means you can usually leave the types off, and that has always been a pet peeve of mine
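For the curious, a toy example of those two escape hatches (nothing to do with the generator above): .expect() panics on the spot, while ? hands the error back to the caller.

use std::num::ParseIntError;

// .expect() panics with a message if the Result is an Err.
fn parse_with_expect(text: &str) -> i64 {
    text.trim().parse().expect("not a number")
}

// ? early-returns the Err, so the caller has to deal with it.
fn parse_with_question_mark(text: &str) -> Result<i64, ParseIntError> {
    let value: i64 = text.trim().parse()?;
    Ok(value * 2)
}

fn main() {
    println!("{}", parse_with_expect("42"));
    match parse_with_question_mark("21") {
        Ok(v) => println!("{v}"),
        Err(e) => eprintln!("failed: {e}"),
    }
}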
The plus sides:
1.) Cargo is great
2.) Forcing error management is probably a good practice
3.) Explicitly calling out mutability is a positive in my head
4.) Unsafe blocks are smart (because the primary goal is to make everything thread safe)
If you want this code to work you gotta pull cpal in with Cargo (the ASIO backend is behind a feature flag) and follow the instructions here: https://github.com/RustAudio/cpal
It’s cold and I’m unhappy, so you can join me in this cacophony of GENERATIVE AUDIO! (Please turn your speakers down; this is the loudest thing I’ve put up and I don’t wanna re-upload.)
“How did you achieve this musically inclined symphony??” I’m glad you asked:
1.) Operator
2.) A clipping, distorted 808
3.) A sub sound (Which is also operator I guess)
4.) A hit of that SWEET SWEET Compression
(also an LFO to play with the sub sound)
Now I’m going to scream into a microphone about my commute and complain about needing to drink more water.
I also found an old Kindle (like a HELLA old Kindle 1) and tried modeling it in Substance and Blender. Spoiler: it looks HORRID.
I thought I could be lazy, not model the buttons, and just throw them in a bump map. But the geometry lords laughed in my face and made everything terrible (also I spent way too long making this color map in GIMP):
But you could argue that the PO-20 and the Kaoss Pad aren’t much better, though at least they required a little bit of technique (not pictured is my bass, which I played three notes on; also ignore the PC speakers). Should I have properly mixed it at the end? Sure, but why do that when you can compress and forget.