I have the patrol bots hooked up to throw them based on an enumeration (which I don't like; I'd rather use a subclass, but I shot myself in the foot earlier on).
I just hijacked the reload animation and basically said "if you're a grenade bot, reload, and instead of throwing a mag up, throw a flash-bang". Pretty stupid for now, but it's a good proof of concept. The shakiness is because I half-implemented physical animations to handle bot hits. I'm still not happy about that; I want to move to a full Unreal animation setup because my animations are so simplistic, but that's another avenue I gotta go learn.
Wanted to make a 200-ish bpm beat but I got bored and glitched them the hell out.
You should NOT do this for any reason; normally it makes a horrid, clippy sound. In this case I compressed and filtered it so it wasn't that bad.
The synth-y part is an Ableton Meld preset with an up arpeggiator in front.
I might remake this guy with the same synth but better, less-bugged-out drums, but I kinda wanna make the tree model break apart the way I wanted in the game I'm making... I'll play it by ear based on what I'm feeling tomorrow.
Recording myself modeling another tree. Then I realized I probably shouldn't be posting videos with copyrighted music, so I made a song to go along with the tree recording. (I used some of the pre-canned Ableton clips for bass and vocals, which feels like cheating, but I wanted something kinda alright and fast; the drums I did myself.)
There’s one branch that looks weird to me but otherwise I think it came out alright.
This also was my first time using Kdenlive (https://kdenlive.org/en/), which is surprisingly good for an open-source video editor. The last time I was editing video heavily was probably in high school with a trial copy of Sony Vegas and Windows Movie Maker.
Highly recommend it if you want basic timelines + audio editing; if you're going crazy with effects it might be better to stick to Adobe's software.
Now I can go probably make small variations to the same tree 3-4 more times to get a good low-poly forest going.
My friend is working on some stuff in Rust and it's popped up in my job a few times, so I wrote an ASIO sine-wave generator in Rust to familiarize myself.
FYI the code below is not debugged/cleaned up and definitely has stuff that can be taken out.
use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};
use cpal::OutputCallbackInfo;

fn main() {
    #[cfg(target_os = "windows")]
    {
        let host = cpal::host_from_id(cpal::HostId::Asio).expect("failed to initialise ASIO host");

        println!("Searching for ASIO Devices...");
        let devices = host.output_devices().expect("No devices found");
        let mut counter = 0;
        for device_from_list in devices {
            counter += 1;
            let name: String = device_from_list.name().expect("YOU LITERALLY JUST GAVE ME THIS, IF YOU FAILED THE PROGRAM SHOULD HAVE STOPPED WHY IS THIS ALLOWED");
            println!("{counter} : {name}");
        }

        // Find my interface by name.
        let search_string = "Focusrite USB ASIO";
        let device: cpal::Device = host
            .output_devices()
            .expect("we already did this!")
            .find(|x| x.name().map(|y| y == search_string).unwrap_or(false))
            .expect("couldn't find the Focusrite");
        let device_print_name: String = device.name().expect("How would this not have a name at this point.");
        println!("Connected To {device_print_name}");

        println!("Default output config: {:?}", device.default_output_config().unwrap());

        // These could be dynamic but I'm keeping them hardcoded to my Focusrite settings.
        let my_stream_config = cpal::StreamConfig {
            channels: 2,
            sample_rate: cpal::SampleRate(192_000),
            buffer_size: cpal::BufferSize::Fixed(1024),
        };

        let err_fn = |err| eprintln!("an error occurred on stream: {}", err);
        let stream = device
            .build_output_stream(&my_stream_config, get_data, err_fn, None)
            .expect("Well I'm stumped");
        stream.play().expect("stream wouldn't start");
        // Let the sine play for a second before the stream is dropped.
        std::thread::sleep(std::time::Duration::from_millis(1000));
    }

    // ASIO hands us interleaved i32 samples; compute one sine value per frame
    // and duplicate it across both channels.
    fn get_data(output: &mut [i32], _callback_info: &OutputCallbackInfo) {
        static mut CURRENT_TIME: f64 = 0.0; // uhhgg, a static mut so the phase survives between callbacks
        unsafe {
            let freq_out: f64 = 440.0;
            let amp_out: f64 = i32::MAX as f64 / 32.0;
            let sample_rate: f64 = 192_000.0;
            let freq_as_rad: f64 = 2.0 * std::f64::consts::PI * freq_out;
            for frame in output.chunks_mut(2) {
                CURRENT_TIME += 1.0 / sample_rate;
                let value = (amp_out * (freq_as_rad * CURRENT_TIME).sin()).trunc() as i32;
                for sample in frame.iter_mut() {
                    *sample = value;
                }
            }
        }
    }
}
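The heart of that callback is just phase accumulation: step time forward by 1/sampleRate once per frame, evaluate sin(2πft), and write the result to every channel. Pulled out of the cpal machinery, a minimal sketch looks like this (the `fill_sine` helper is my own name for illustration, not part of cpal):

```rust
use std::f64::consts::PI;

// Fill an interleaved buffer with a sine wave, one time step per frame.
// Returns the updated time so the next call continues the wave seamlessly.
fn fill_sine(buf: &mut [i32], channels: usize, sample_rate: f64, freq: f64, amp: f64, mut t: f64) -> f64 {
    let w = 2.0 * PI * freq; // angular frequency in rad/s
    for frame in buf.chunks_mut(channels) {
        t += 1.0 / sample_rate;
        let value = (amp * (w * t).sin()).trunc() as i32;
        for sample in frame.iter_mut() {
            *sample = value; // same value on every channel of the frame
        }
    }
    t
}

fn main() {
    let mut buf = [0_i32; 8]; // 4 stereo frames
    let t = fill_sine(&mut buf, 2, 192_000.0, 440.0, i32::MAX as f64 / 32.0, 0.0);
    println!("{:?} (t = {})", buf, t);
}
```

One caveat with this approach: `t` grows forever, and f64 precision slowly degrades as it does, so a longer-running generator would usually accumulate phase and wrap it modulo 2π instead.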
Anyone who knows Rust probably won't like the way I wrote this, but it does what I set out to do. Which brings me to the stuff I don't really like about Rust on first impressions:
1.) It expects you're going to develop multi-threaded applications: this is totally reasonable, as that is the primary draw of using Rust. However, if you're trying to pump out an easy, single-threaded connect-and-execute style of device control, it can be quite frustrating trying to understand how an API functions.
2.) A lot of error handling: I'm cool with this concept, but it's probably going to end up with me putting .expect() or ? at the end of the majority of my function calls.
3.) Optional type annotations: Rust will happily infer types everywhere, and that has always been a pet peeve of mine
The plus sides
1.) Cargo is great
2.) Forcing error management is probably a good practice
3.) Explicitly calling out mutability is a positive in my head
4.) Unsafe blocks are smart (because the primary goal is to make everything thread safe)
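On the error-handling point, the `?`-everywhere style is tidier than it sounds once the error type flows through the signature. A small sketch (the `parse_freq` helper is made up for illustration):

```rust
use std::num::ParseIntError;

// Parse a string as a frequency, bubbling the error up with `?`
// instead of panicking with `.expect()`.
fn parse_freq(s: &str) -> Result<u32, ParseIntError> {
    let n = s.trim().parse::<u32>()?; // `?` returns early on Err
    Ok(n)
}

fn main() {
    // The caller decides what a failure means; here we just fall back.
    let freq = parse_freq("440").unwrap_or(1000);
    println!("freq = {freq}");
    let bad = parse_freq("not a number").unwrap_or(1000);
    println!("fallback = {bad}");
}
```

The difference from `.expect()` is just who handles the failure: `?` hands it to the caller, `.expect()` kills the program on the spot.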
If you want this code to work you gotta pull in cpal with its ASIO feature via Cargo and follow the ASIO setup instructions here: https://github.com/RustAudio/cpal
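For reference, that means something like this in Cargo.toml (the version number is just whatever was current for me; check the repo if the asio feature has moved):

```toml
[dependencies]
cpal = { version = "0.15", features = ["asio"] }
```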
It's cold and I'm unhappy, so you can join me in this cacophony of GENERATIVE AUDIO! (Please turn your speakers down; this is the loudest thing I've put up and I don't wanna re-upload.)
"How did you achieve this musically inclined symphony??" I'm glad you asked:
1.) Operator
2.) A clipping, distorted 808
3.) A sub sound (Which is also operator I guess)
4.) A hit of that SWEET SWEET Compression
(also an LFO to play with the sub sound)
Now I’m going to scream into a microphone about my commute and complain about needing to drink more water.
I also found an old Kindle (like a HELLA old Kindle 1) and tried modeling it in Substance and Blender. Spoiler: it looks HORRID.
I thought I could be lazy, not model the buttons, and just throw them in a bump map. But the geometry lords laughed in my face and made everything terrible (also I spent way too long making this color map in GIMP):
But you could argue that the PO-20 and the Kaoss Pad aren't much better; at least they required a little bit of technique (not pictured is my bass, which I played three notes on; also ignore the PC speakers). Should I have properly mixed at the end? Sure, but why do that when you can compress and forget.
The bass sounds alright, but it's still MIDI-fied because I opted for a sampled bass rather than being less lazy and using my real bass. I'm also moving down the path of using the arrangement view to handle making these loops from now on; it forces me to think of anything I make as a whole piece rather than a collection of loop states.
All work and blank scenes makes will indifferent
I also finished The Blade Itself (book one of this series: https://en.wikipedia.org/wiki/The_First_Law), which I enjoyed (also, it was pretty much the only book I read in 2024), but it ended kind of abruptly. Felt as if the whole series was written at once and then arbitrarily chopped. I found out today that one of the side books is getting made into a movie, so maybe I'll try to read that before it's made.
I dig the vibes from this but there's a few big issues:
1.) The bass sounds MIDI as hell, like Doom 1 levels of computerization
2.) I had trouble mixing the drums; I dug how robotic they sounded, but it was hard balancing them with the lead synth
3.) The lead synth is cool as hell, but really I would've rather had a real, distorted-as-hell guitar
Maybe I'll go grab a crappy guitar and re-record this with real instruments to see if the problem is the lack of humanization or if it's just a poorly thought-out song in general. My bet is that once I try playing it, I'll see some weirdness that will change the underlying song into more of a real thing.