Using ML to make a well-known and established DSP technique harder (Part 2)

I got some feedback from my last post which basically said:

1.) “Why didn’t you test against different signals?”

2.) “The signal I chose to match against was pretty simple, you should do a continuous function”

3.) “Why wouldn’t you make a subfunction to detect something like a pulse (containing something like a comms symbol) then apply it to a larger context?”

Valid points…

Testing different signals

Went through and updated the signal simulator to take in more than one type of signal.

The results aren’t great:

(Again green is detected and red is not detected).

So this tells me that the AI might be trained around the absence of pure noise rather than the detection of the signal! To fix this I expanded training to take in the signal types I made here (Gaussian, sine wave, sawtooth wave).
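
Roughly, the extra signal types boil down to something like this (a sketch: the cycle counts and the use of scipy here are placeholders, not necessarily what the simulator actually does):

import numpy as np
from scipy import signal as sps

N = 768
x = np.arange(N)

def gaussian_pair():
    # the original two-Gaussian test signal, normalized to a peak of 1
    g1 = np.exp(-0.5 * ((x - 196) / 10) ** 2)
    g2 = np.exp(-0.5 * ((x - 588) / 20) ** 2)
    v = g1 + g2
    return v / v.max()

def sine_wave(cycles=8):
    # single-tone sine across the window
    return np.sin(2 * np.pi * cycles * x / N)

def sawtooth_wave(cycles=8):
    # sawtooth at the same nominal rate
    return sps.sawtooth(2 * np.pi * cycles * x / N)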

I started with 20000 varying SNR signals for each type and the results seem good enough:

The edge cases are the weirdest part, where each type seems to be marked as a successful detection. To fix this I added a limit to the shift amount checked for detection, just to see if I could get it to look seemingly perfect.

I think the issue is directly related to #2: I’m not using a continuous function for detection, so I’m essentially making the two “non-signals” look close to two Gaussian functions, and the neural net detects them as positive.

Continuous functions

To enable this I need to re-write a bunch of the code. I also need to re-work how I’m identifying an “ideal” signal. I’ll jump back on this next week and show off the results; my hope is to essentially feed time data into the neural net and hopefully it can determine frequency content as needed.

More to figure out next week.

Using ML to make a well-known and established DSP technique harder

I decided to bite the bullet and start learning PyTorch (https://pytorch.org/) and basic neural net creation. (This was also a reason I dropped $2000ish on a 4090 a year ago, with the promise that I could use it to do things like this.) After the basic letter recognition tutorial (here) I decided I wanted to make a 1-D signal recognition tutorial.

Here’s the plan:

  • Make a simple signal
  • Make a simple neural network
  • Train an AI to detect the signal in noise
  • Test the neural net with data (Ideally in real time)

From this I surmised I would need a few Python scripts: one for generating test data, one for training the model, one for defining the model, and some kind of simulator to test out the model. Ideally with a plot running in real time for a cool video.

Quick aside: if you see “Tensor” it isn’t a “Tensor (https://en.wikipedia.org/wiki/Tensor)”. It seems more like a grammar replacement for a multi-dimensional matrix? To my knowledge there are no size or value checks for linearity when you convert a numpy array to a tensor in PyTorch, so I’m guessing there’s no guarantee of linearity? (Or maybe it’s because every element is a real constant that I pass the checks every time?)
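
A quick illustration of that conversion (the array contents here are arbitrary):

import numpy as np
import torch

x = np.random.randn(768).astype(np.float32)
t = torch.from_numpy(x)   # wraps the same memory; no shape or value checks beyond dtype support
print(t.shape, t.dtype)   # torch.Size([768]) torch.float32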

Make a simple signal

The signal I chose as my test signal was two Gaussian curves across 768 samples. I made this with numpy (which is much, much worse than MATLAB by the way, why isn’t there an rms function??? https://github.com/numpy/numpy/issues/15940 ).
Essentially it’s 10ish lines of code:

def ideal_signal():
    # Two Gaussian pulses across a 768-sample window, normalized to a peak of 1
    return_size = 768
    x = np.arange(return_size)
    # Gaussian parameters
    mu1, sigma1 = 196, 10   # center=196, std=10
    mu2, sigma2 = 588, 20   # center=588, std=20
    # Create two Gaussian curves
    gauss1 = np.exp(-0.5 * ((x - mu1) / sigma1) ** 2)
    gauss2 = np.exp(-0.5 * ((x - mu2) / sigma2) ** 2)
    # Combine them into one vector and normalize
    vector = gauss1 + gauss2
    return vector / vector.max()

Easy enough, now onto harder things.

Make a simple neural network

So making the network is obviously the trickiest part here. You can easily mess up if you don’t understand the I/O of the network. Also, choosing the network complexity currently seems to be some kind of hidden magic wielded by PhDs. In my case I had the following working for me:

  • My Input vector size will ALWAYS be 768 (No need to detect/truncate any features)
  • My Output Vector will ALWAYS be a binary (either a yes or no whether or not the signal is present)

So this makes my life a bit easier in selection. The first layer of the neural network will be sized 768, then a hidden layer of 128, then an output of 2. Why a hidden layer of 128? I have no idea; this is where I’m lacking knowledge-wise. My original thought was that the “features” I was looking for would fit into a 128-sample window, but as I’ve progressed I’ve realized that is a poor way to choose your hidden layer. My guess is that I’m doing too much (math/processing wise), but this assumption that I made upfront seems to have worked for me.
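
For reference, a network with those sizes can be sketched in a few lines of PyTorch (the ReLU activation here is an arbitrary but typical choice; only the layer sizes are fixed by the description above):

import torch
from torch import nn

class SignalDetector(nn.Module):
    def __init__(self, input_size=768, hidden_size=128, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, num_classes),  # two logits: [no signal, signal]
        )

    def forward(self, x):
        return self.net(x)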

Train an AI to detect the signal in noise

So this is the bulk of the work here. The idea was to generate a bunch of signals that represent the span of real-world signals a detector such as this would see. You have pretty much three parameters to mess with: signal delay (or number of samples to shift), signal SNR, and system noise floor. Technically you do not need the system noise floor; I have it in there as a parameter anyways because I wanted to facilitate customization of any scripts I made for later. In my case I kept the system noise floor constant at -30dB.

Essentially the pseudocode for generating signals is (a rough sketch in code follows the list):

1.) Take the ideal signal made above

2.) Apply a sample delay (positive or negative) up to a limit so we don’t lose the two Gaussian peaks

3.) Apply noise at a random SNR within a bounded limit (in my case it was -10dB to 10dB)

4.) Label the signal as a positive when the SNR I applied to the noise is above the level where I expect a detection.
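
Something like this (the shift limit and detection threshold shown here are placeholder values; the -10 to 10 dB SNR bounds are the ones mentioned above):

import numpy as np

def make_training_example(ideal, max_shift=100, snr_bounds=(-10.0, 10.0),
                          detect_snr_db=0.0, rng=None):
    rng = rng or np.random.default_rng()

    # 1/2) shift the ideal signal by a bounded random number of samples
    shift = int(rng.integers(-max_shift, max_shift + 1))
    shifted = np.roll(ideal, shift)

    # 3) add white noise at a random SNR within the bounded limit
    snr_db = rng.uniform(*snr_bounds)
    sig_power = np.mean(shifted ** 2)
    noise_power = sig_power / (10 ** (snr_db / 10))
    noisy = shifted + rng.normal(0.0, np.sqrt(noise_power), size=ideal.shape)

    # 4) label as a positive when the applied SNR clears the detection threshold
    label = 1 if snr_db >= detect_snr_db else 0
    return noisy.astype(np.float32), label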

As for the size of the training data, again I had no idea what I was doing here, so I just guessed 50,000 signals to shove into the neural net. In reality this was a guess-and-check process to see if I under-trained/over-trained as I tested the model (I’m writing this after getting a working model, which is uncommon for most of these posts). The training quality is also arbitrary here; I just kept running numbers until things worked. Kind of lame I admit, but I have a 4090 and training takes 15 seconds, so I have that luxury here.
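
The training loop itself is nothing special; it’s roughly the standard classification setup (the specific loss, optimizer, batch size, and epoch count below are illustrative placeholders, not necessarily what my script uses):

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

def train(model, signals, labels, epochs=5, device="cuda"):
    # signals: (N, 768) float32 tensor, labels: (N,) long tensor of 0/1
    loader = DataLoader(TensorDataset(signals, labels), batch_size=256, shuffle=True)
    loss_fn = nn.CrossEntropyLoss()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.to(device).train()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model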

Test the neural net with data (Ideally in real time)

To build out the simulator I wanted, I basically took a bunch of the generation code from the training data and threw it into a common functions wrapper for re-use. My end artifact was a simple PyQt (https://wiki.python.org/moin/PyQt) app that had buttons for starting and stopping a simulation and sliders for sample offset and signal SNR. Then I would make an indicator for whether the neural net detected the signal or not. The only difficulty with this is mostly just dealing with async programming (which is a difficulty with all AI work). My solution here was to have the main thread run the Qt GUI, then spawn a background thread to do the AI work. The main thread generates the data and, after plotting, does a thread-safe send (using pyqtSignal) to the background thread with the data. The background thread then processes the incoming data using the GPU and sends a positive or negative result back to the main app to change the line color: green for detected and red for not detected.
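
The thread split looks roughly like this (a sketch assuming PyQt5; the class and signal names are illustrative, not the actual ones in the repo):

import torch
from PyQt5.QtCore import QObject, pyqtSignal

class DetectorWorker(QObject):
    result_ready = pyqtSignal(bool)  # True -> color the plotted line green

    def __init__(self, model, device="cuda"):
        super().__init__()
        self.model = model.to(device).eval()
        self.device = device

    def process(self, samples):
        # called via a queued (thread-safe) signal connection from the GUI thread
        x = torch.tensor(samples, dtype=torch.float32, device=self.device).unsqueeze(0)
        with torch.no_grad():
            probs = torch.softmax(self.model(x), dim=1)[0]
        # "yes" has to beat "no" AND exceed 60% to count as a detection
        self.result_ready.emit(bool(probs[1] > probs[0] and probs[1] > 0.6))

# Wiring in the GUI thread, roughly:
#   worker = DetectorWorker(model); thread = QThread()
#   worker.moveToThread(thread); thread.start()
#   gui.samples_ready.connect(worker.process)        # pyqtSignal carrying the plotted samples
#   worker.result_ready.connect(gui.set_line_color)  # delivered back on the GUI thread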

Results

(Top slider is sample offset, bottom slider is SNR)

The plot above is locked to 20 fps and all of the calls to the neural net on the GPU return well before the frame is finished drawing. It’s pretty surprising this worked out as well as it did. However, there are definitely issues with the hard coupling to the signal being centered. I honestly still do NOT think this is anywhere near what you would want for any critical system (black-boxing a bunch of math in the middle seems like a bad idea). However, for a quick analysis tool I can see this being useful if packaged in a manner aimed at non-DSP-knowledgeable engineers.

Other Notes/Issues/Things I Skipped

I did many more iterations I didn’t write about here. I had issues with model sizing, training data types, implementing a dataset compatible with PyTorch, weird plotting artifacts, etc.

Future work I want to do in this space:

  • Get better at understanding neural net sizing; I feel like I went arbitrary here, which I’m not a fan of
  • Try to make the neural net more confident where there is NO signal. The neural net doesn’t return a pure binary answer; it returns a probability of no signal and a probability of signal. When you see a red signal it’s really checking if the “yes” probability is greater than the “no” probability AND if the “yes” probability is greater than 60%. Ideally you would see values such as [0.001 0.99] when there is a signal present and [0.99 0.001] when there’s no signal. However, the “no” probability seems to hover at 40-50% constantly, and when there’s low SNR the probabilities are both around 50%, which is pretty much useless here.

Comparison to matched filtering

             My Neural Net   Full Convolution With Matched Filter
Multiplies   98,560          589,824
Additions    98,430          589,056
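
Where those numbers come from, assuming two fully connected layers (768 → 128 → 2) with no bias terms and a full-length 768-tap matched filter:

nn_mults = 768 * 128 + 128 * 2   # 98,560
nn_adds  = 767 * 128 + 127 * 2   # 98,430
mf_mults = 768 * 768             # 589,824
mf_adds  = 767 * 768             # 589,056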

So on paper I guess this is “technically” less work on the PC than a perfect matched filter response (i.e. auto-correlation). However, I think the problem here is that using a neural net to do something this simple is probably much more processing intensive than just making a filter that pulls out the content you want. But that being said, if you had a high-pressure situation, needed to do simple signal detection, and just happened to have a free NPU in your system, this could work.

The number of TOPs (see https://www.qualcomm.com/news/onq/2024/04/a-guide-to-ai-tops-and-npu-performance-metrics) that my neural net uses is ≈ 98,560/10^12 = 9.856e-8 TOPs, which is INCREDIBLY SMALL for most NPUs, so most likely I could operate this in any real-time configuration (even on the cheapest possible Qualcomm NPU here: https://en.wikipedia.org/wiki/Qualcomm_Hexagon which has around 3 TOPS; to give a better comparison, my 4090 runs 1300ish TOPS).

Next Steps

I still want to get better at building these, so I think my next step will be more tuned towards analysis of several signals, then trying to combine them and bin them into each category (pretty much like the PyTorch tutorial but more deliberate). Also get the code on GitHub… Code is on GitHub here: https://github.com/wfkolb/ml_signal_detector/tree/main

“Stay a while and Listen”

I added some visitor analytics to wordpress via a custom plugin

As a non-web programmer this was surprisingly easy…Only because I have AI.

Claude AI was able to throw everything into a script, package it into a zip file, and install it into WordPress. A part of me feels like I cheated, but at the same time I don’t want to spend hours learning the WordPress database system, plugin API, and Chart.js just to see a few numbers and a line plot.

I’ve attached the code below; you can package it into a .zip and upload it to any WordPress site.

Expand for plugin php code.
<?php
/**
 * Plugin Name: Visitor Counter Dashboard
 * Description: Displays visitor statistics in a graph on the admin dashboard
 * Version: 1.3.0
 * Author: Claude Ai (via Will Kolb)
 */

// Prevent direct access
if (!defined('ABSPATH')) {
    exit;
}

class VisitorAnalytics {
    
    private $table_name;
    
    public function __construct() {
        global $wpdb;
        $this->table_name = $wpdb->prefix . 'visitor_analytics';
        
        // Hook into WordPress
        add_action('init', array($this, 'track_visitor'));
        add_action('wp_dashboard_setup', array($this, 'add_dashboard_widget'));
        add_action('admin_enqueue_scripts', array($this, 'enqueue_admin_scripts'));
        add_action('wp_ajax_get_visitor_data', array($this, 'ajax_get_visitor_data'));
        
        // Activation hook
        register_activation_hook(__FILE__, array($this, 'create_table'));
    }
    
    /**
     * Create database table on plugin activation
     */
    public function create_table() {
        global $wpdb;
        
        $charset_collate = $wpdb->get_charset_collate();
        
        $sql = "CREATE TABLE {$this->table_name} (
            id mediumint(9) NOT NULL AUTO_INCREMENT,
            visit_date date NOT NULL,
            visit_count int(11) NOT NULL DEFAULT 1,
            ip_address varchar(45) NOT NULL,
            user_agent text,
            page_url varchar(255),
            created_at datetime DEFAULT CURRENT_TIMESTAMP,
            PRIMARY KEY (id),
            UNIQUE KEY unique_daily_ip (visit_date, ip_address)
        ) $charset_collate;";
        
        require_once(ABSPATH . 'wp-admin/includes/upgrade.php');
        dbDelta($sql);
    }
    
    /**
     * Track visitor on each page load
     */
    public function track_visitor() {
        // Don't track admin users or admin pages
        if (is_admin() || current_user_can('manage_options')) {
            return;
        }
        
        global $wpdb;
        
        $ip_address = $this->get_user_ip();
        $today = current_time('Y-m-d');
        $user_agent = sanitize_text_field($_SERVER['HTTP_USER_AGENT'] ?? '');
        $page_url = sanitize_text_field($_SERVER['REQUEST_URI'] ?? '');
        
        // Check if this IP has already been recorded today
        $existing = $wpdb->get_row($wpdb->prepare(
            "SELECT id FROM {$this->table_name} WHERE visit_date = %s AND ip_address = %s",
            $today, $ip_address
        ));
        
        if (!$existing) {
            // Insert new visitor record
            $wpdb->insert(
                $this->table_name,
                array(
                    'visit_date' => $today,
                    'ip_address' => $ip_address,
                    'user_agent' => $user_agent,
                    'page_url' => $page_url,
                    'visit_count' => 1
                ),
                array('%s', '%s', '%s', '%s', '%d')
            );
        }
    }
    
    /**
     * Get user's IP address
     */
    private function get_user_ip() {
        $ip_keys = array('HTTP_CF_CONNECTING_IP', 'HTTP_CLIENT_IP', 'HTTP_X_FORWARDED_FOR', 'REMOTE_ADDR');
        
        foreach ($ip_keys as $key) {
            if (array_key_exists($key, $_SERVER) === true) {
                $ip = sanitize_text_field($_SERVER[$key]);
                if (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_NO_PRIV_RANGE | FILTER_FLAG_NO_RES_RANGE)) {
                    return $ip;
                }
            }
        }
        
        return sanitize_text_field($_SERVER['REMOTE_ADDR'] ?? '127.0.0.1');
    }
    
    /**
     * Add dashboard widget
     */
    public function add_dashboard_widget() {
        wp_add_dashboard_widget(
            'visitor_analytics_widget',
            'Visitor Analytics',
            array($this, 'display_dashboard_widget')
        );
    }
    
    /**
     * Display the dashboard widget content
     */
    public function display_dashboard_widget() {
        ?>
        <div id="visitor-analytics-container">
            <div style="margin-bottom: 15px;">
                <select id="analytics-period" style="margin-right: 10px;">
                    <option value="7">Last 7 days</option>
                    <option value="30">Last 30 days</option>
                    <option value="90">Last 90 days</option>
                </select>
                <button id="refresh-analytics" class="button button-secondary">Refresh</button>
            </div>
            
            <div id="analytics-summary" style="display: flex; gap: 20px; margin-bottom: 20px;">
                <div style="text-align: center;">
                    <h3 style="margin: 0; color: #23282d;">Today</h3>
                    <p style="font-size: 24px; font-weight: bold; margin: 5px 0; color: #0073aa;" id="today-visitors">-</p>
                </div>
                <div style="text-align: center;">
                    <h3 style="margin: 0; color: #23282d;">This Week</h3>
                    <p style="font-size: 24px; font-weight: bold; margin: 5px 0; color: #00a32a;" id="week-visitors">-</p>
                </div>
                <div style="text-align: center;">
                    <h3 style="margin: 0; color: #23282d;">This Month</h3>
                    <p style="font-size: 24px; font-weight: bold; margin: 5px 0; color: #d63638;" id="month-visitors">-</p>
                </div>
            </div>
            
            <canvas id="visitor-chart" width="400" height="200" style="max-height: 200px;"></canvas>
            <div id="analytics-loading" style="text-align: center; padding: 20px;">Loading...</div>
        </div>
        
        <script>
        jQuery(document).ready(function($) {
            let chart = null;
            
            // Cleanup function to properly destroy chart
            function destroyChart() {
                if (chart && typeof chart.destroy === 'function') {
                    chart.destroy();
                    chart = null;
                }
            }
            
            // Clean up when leaving the page
            $(window).on('beforeunload', destroyChart);
            
            function loadAnalytics() {
                const period = $('#analytics-period').val();
                $('#analytics-loading').show();
                
                $.ajax({
                    url: ajaxurl,
                    type: 'POST',
                    data: {
                        action: 'get_visitor_data',
                        period: period,
                        nonce: '<?php echo wp_create_nonce('visitor_analytics_nonce'); ?>'
                    },
                    success: function(response) {
                        if (response.success) {
                            updateSummary(response.data.summary);
                            updateChart(response.data.chart_data);
                        }
                        $('#analytics-loading').hide();
                    },
                    error: function() {
                        $('#analytics-loading').hide();
                        alert('Error loading analytics data');
                    }
                });
            }
            
            function updateSummary(summary) {
                $('#today-visitors').text(summary.today || 0);
                $('#week-visitors').text(summary.week || 0);
                $('#month-visitors').text(summary.month || 0);
            }
            
            function updateChart(chartData) {
                const canvas = document.getElementById('visitor-chart');
                const ctx = canvas.getContext('2d');
                
                // Properly destroy existing chart
                if (chart) {
                    chart.destroy();
                    chart = null;
                }
                
                // Clear the canvas
                ctx.clearRect(0, 0, canvas.width, canvas.height);
                
                // Reset canvas size
                canvas.style.width = '100%';
                canvas.style.height = '200px';
                
                chart = new Chart(ctx, {
                    type: 'line',
                    data: {
                        labels: chartData.labels,
                        datasets: [{
                            label: 'Daily Visitors',
                            data: chartData.data,
                            borderColor: '#0073aa',
                            backgroundColor: 'rgba(0, 115, 170, 0.1)',
                            borderWidth: 2,
                            fill: true,
                            tension: 0.4
                        }]
                    },
                    options: {
                        responsive: true,
                        maintainAspectRatio: false,
                        interaction: {
                            intersect: false,
                            mode: 'index'
                        },
                        scales: {
                            y: {
                                beginAtZero: true,
                                ticks: {
                                    stepSize: 1,
                                    precision: 0
                                }
                            },
                            x: {
                                grid: {
                                    display: false
                                }
                            }
                        },
                        plugins: {
                            legend: {
                                display: false
                            },
                            tooltip: {
                                backgroundColor: 'rgba(0, 0, 0, 0.8)',
                                titleColor: '#fff',
                                bodyColor: '#fff',
                                cornerRadius: 4
                            }
                        },
                        elements: {
                            point: {
                                radius: 3,
                                hoverRadius: 6
                            }
                        }
                    }
                });
            }
            
            // Event listeners
            $('#analytics-period').change(loadAnalytics);
            $('#refresh-analytics').click(loadAnalytics);
            
            // Initial load
            loadAnalytics();
        });
        </script>
        <?php
    }
    
    /**
     * Enqueue admin scripts
     */
    public function enqueue_admin_scripts($hook) {
        if ($hook === 'index.php') {
            wp_enqueue_script('chart-js', 'https://cdnjs.cloudflare.com/ajax/libs/Chart.js/3.9.1/chart.min.js', array(), '3.9.1', true);
        }
    }
    
    /**
     * AJAX handler for getting visitor data
     */
    public function ajax_get_visitor_data() {
        // Verify nonce
        if (!wp_verify_nonce($_POST['nonce'], 'visitor_analytics_nonce')) {
            wp_die('Security check failed');
        }
        
        // Check user permissions
        if (!current_user_can('manage_options')) {
            wp_die('Insufficient permissions');
        }
        
        global $wpdb;
        $period = intval($_POST['period']);
        
        // Get summary data
        $today = current_time('Y-m-d');
        $week_ago = date('Y-m-d', strtotime('-7 days', current_time('timestamp')));
        $month_ago = date('Y-m-d', strtotime('-30 days', current_time('timestamp')));
        
        $today_count = $wpdb->get_var($wpdb->prepare(
            "SELECT COUNT(*) FROM {$this->table_name} WHERE visit_date = %s", $today
        ));
        
        $week_count = $wpdb->get_var($wpdb->prepare(
            "SELECT COUNT(*) FROM {$this->table_name} WHERE visit_date >= %s", $week_ago
        ));
        
        $month_count = $wpdb->get_var($wpdb->prepare(
            "SELECT COUNT(*) FROM {$this->table_name} WHERE visit_date >= %s", $month_ago
        ));
        
        // Get chart data
        $start_date = date('Y-m-d', strtotime("-{$period} days", current_time('timestamp')));
        
        $chart_data = $wpdb->get_results($wpdb->prepare(
            "SELECT visit_date, COUNT(*) as visitor_count 
             FROM {$this->table_name} 
             WHERE visit_date >= %s 
             GROUP BY visit_date 
             ORDER BY visit_date ASC",
            $start_date
        ));
        
        // Fill in missing dates with zero counts
        $labels = array();
        $data = array();
        
        for ($i = $period - 1; $i >= 0; $i--) {
            $date = date('Y-m-d', strtotime("-{$i} days", current_time('timestamp')));
            $labels[] = date('M j', strtotime($date));
            
            $count = 0;
            foreach ($chart_data as $row) {
                if ($row->visit_date === $date) {
                    $count = intval($row->visitor_count);
                    break;
                }
            }
            $data[] = $count;
        }
        
        wp_send_json_success(array(
            'summary' => array(
                'today' => intval($today_count),
                'week' => intval($week_count),
                'month' => intval($month_count)
            ),
            'chart_data' => array(
                'labels' => $labels,
                'data' => $data
            )
        ));
    }
}

// Initialize the plugin
new VisitorAnalytics();

In gamedev news I want to make a “boss” area thing that the player has to destroy

I think I want to incorporate that helix thingy as the “power core” (see: this post).

I’ve made a few iterations but they keep looking like the Red Power Ranger’s vape pen:

My hope was to make an equivalent to the Doom 2016 gore nest or the Half-Life 2 energy pylon things, so that the player would essentially break the core, causing some gameplay rush event where you gotta kill a bunch of bots and run.

From https://half-life.fandom.com/wiki/Combine_Power_Generator

I might go with some kind of oppressive robotic face instead? Like the Mussolini face from WWII.

from https://en.wikipedia.org/wiki/Palazzo_Braschi

More Cyber

ChatGPT is my friend.

This is in a branch:
https://github.com/wfkolb/CyberpunkRedCyberspace/tree/wk_feature/NewPlayerVisualization

It isn’t hooked up to Firebase yet (to pull the floor information), but I figure I can make a few models to handle each level and a bunch for the ICE daemons (you can see the dog for the hellhound: https://cyberpunk.fandom.com/wiki/Hellhound_(RED) ); I’ll probably go through the list and make models for each. Outside of wireframe the model is…rough:

Vibin’ for some cyberpunk

Did a bit more work on the cyberpunk stuff

Functionally I think I’m close. I’m probably not going to add branching paths to the floors, but I’d rather get this locked down first; then I can transition the viewing to a better 3D model view (with elevator and all).

Code is posted here:

https://github.com/wfkolb/CyberpunkRedCyberspace

Again this is mostly ChatGPT, but I got to the point where you can’t really expect ChatGPT to do 100% of the work; it is still like asking someone else for help. ChatGPT can’t read your brain and it can’t know 100% of what you’re seeing and what’s on your computer, so you need to modularize as much as you can. In this case I set up the website as essentially a few JavaScript classes that each dump a <div> that I add to the baseline HTML page. The HTML page has a connection to Firebase (https://firebase.google.com/ which has gotten MUCH bigger since I last used it in 2014; ironically I had a “senior app developer” tell me using Firebase was “unprofessional”, and now it seems like it’s the backend to a bunch of professional tools). The main page handles the pull/push from the server, and I have two “room display” functions that display the rooms differently based upon whether you’re a GM or a player.

I have my todo list here: https://github.com/users/wfkolb/projects/2

I really need to finish the starting environment for blacklace but I had a two week gap that I wanted to setup a cyberpunk Red game for…so I’m here.

More Vibe coding

I’ve wanted to set up a way to visualize the cyberspace for Cyberpunk Red ( https://www.cbr.com/cyberpunk-red-netrunner-class-explained/ ) for a while and I never found a good solution. So I tried spinning up my own version with some ChatGPT help:

Looks like crap, but if I get a simple viewer working it should be easy enough to move everything into three.js to make a better visualization for personal games. I’m hoping to get a retro terminal vibe going with this one, kinda like my other visualization here: https://willkolb.com/?p=566

Essentially all this does right now is update a Firebase database with some information that the person at the “GM” URL can edit and the person at the “player” URL cannot edit. So nothing groundbreaking, but I’m starting to find the weak spots of pure AI programming, which are kind of annoying. The big one is that the context saving/switching between conversations isn’t great, but I think that’s also because I’m using the ChatGPT free tier.

Also, because this is just an HTML file and I’m the only person contributing, I can just throw it onto GitHub! (Unreal projects are too big for free GitHub.)

https://github.com/wfkolb/CyberpunkRedCyberspace

Touch ups

I spent the day listening to old Gorillaz YouTube videos while working. Once I hopped on Ableton I tried re-creating “Welcome to the World of the Plastic Beach” from memory:

It turned out better than I thought after I compared it to the original. (You may notice the beeps from Rhinestone Eyes that I somehow got mixed up in my head and added to this song.)

The hardest part here was the vocoder, which I rarely play with and which is kinda annoying to get right. So for my future self, here are the vocoder settings:

Here’s the operator that’s driving it

And here’s what’s being played:

Sounds pretty good, I think I could touch up the input noises more and if I got a better brass sound I think it would work better.

Otherwise I spent some time “vibe coding” to touch up the site. Vibe coding is just asking ChatGPT to do stuff for you. I can still see why you would need an engineer to do this, so I won’t say it’s totally a replacement, but damn did it take a 5-hour task and make it like 20 minutes.

(You can only see the above if you’re viewing the website on a PC.) Both the top section and the bottom are custom shortcodes added to the site by asking ChatGPT a bunch of stuff. Here’s the PHP for the category stuff. This also requires a plugin; ChatGPT suggested one, but I found an older plugin that was open source and asked ChatGPT to change the initial code it gave me to use that one instead. It’s pretty crazy how simple that quick reconfiguration was.

//////
//Category Icon List
/////
function shortcode_category_icons() {
    if ( ! function_exists('get_term_icon_url') ) {
        return '<p><em>Icon Categories plugin not active or function missing.</em></p>';
    }

    $output = '<ul class="category-icons">';
    $categories = get_categories([
        'hide_empty' => false, // Show all categories, including empty ones
    ]);

    foreach ( $categories as $category ) {
        $icon_url = get_term_icon_url( $category->term_id, 'category' ); // get the icon URL

        if ( $icon_url ) {
            $output .= '<li>
                <a href="' . esc_url( get_category_link( $category->term_id ) ) . '" title="' . esc_attr( $category->name ) . '">
                    <img src="' . esc_url( $icon_url ) . '" alt="' . esc_attr( $category->name ) . '" class="category-icon-img"/>
                </a>
            </li>';
        }
    }

    $output .= '</ul>';
    return $output;
}
add_shortcode( 'category_icons', 'shortcode_category_icons' );

Also under the title page

That little line above I spent at least 3 hours trying to get working before I started embracing AI. ChatGPT understood the problem and gave me a solution in seconds. Essentially it was adding a <div> to the “header.php”.

Then some JavaScript using another plugin that ChatGPT suggested, “WP Headers And Footers”. It was so easy that I imagine it probably isn’t worth blogging too much about. But I see myself getting more into the AI dev side as I go along.

UE Control rigs and Custom Numpad setup

I kept looking at the last video I posted and got really annoyed at how bad the animations of the patrol bot looked. This drove me to remake the animation rig for the patrol bot (to be much, much simpler) and I started making a control rig in unreal to push the animations there (see https://dev.epicgames.com/documentation/en-us/unreal-engine/how-to-create-control-rigs-in-unreal-engine).

Now why use Control Rigs? Honestly, not too sure yet… My hope is that IK is easier (so I don’t have to add IK bones), but adding these controls seems tedious; in addition, your forward solve block apparently has to be a big tower of blueprint power?

I’m probably missing the plot here, but also my “forward solve” seems to be undoing my “backwards solve” where I assumed “backwards solve” was used for IK situations. I’m still playing around but the hope is that by using unreal internals I should be able to handle the physicality of the bots a bit better than I expect.

Also, I had a GMMK numpad (https://www.gloriousgaming.com/products/gmmk-numpad-keyboard?srsltid=AfmBOoqt9KEojE6tva-cmlDcTDtw1XiBNFEktoFWoobeNvWKYGD8ZtL0 I got it on sale for $90 but I think it’s still crazy overpriced) which I wanted to use for Ableton and never got working (the slider plus the knob weren’t configurable to MIDI in). So I installed a VIA flavor of QMK ( https://caniusevia.com/ ) which lets me configure my keyboard in the browser.

Which would be amazing if: 1.) Firefox supported USB HID and 2.) I could remap the slider. But right now it’s really handy for recording with OBS rather than the snipping tool, which records at like 20fps.

So for example here’s the control rig I described in action, BUT there’s no OBS window visible!

Otherwise I think I’m still dead-set on remaking the patrol bot animations in Unreal. Walking and reload might be the most annoying, but mostly I want to be able to make quick animations without the huge headache of .fbx exporting and importing (even with the Blender Unreal addons https://www.unrealengine.com/en-US/blog/download-our-new-blender-addons , which work amazingly for static meshes, kinda sketchy for skeletal). I kinda wish Unreal had the option of slicing the animation list of an FBX and attempting bone resolution before importing. I really want to get this working because then my workflow stays in Unreal for animating. Blender still blows Unreal out of the water for making meshes IMO, but animations in Blender still seem hacky with the action editor.

There are a few things I’m actively annoyed with when it comes to control rigs (which I won’t show you here because I’m still WIP on this one). I’m also a straight-up control rig novice, so I bet as I learn, these problems might be solved with better practices.

1.) You can’t manipulate the control shapes in the editor preview window. Seems like that would be an easy addition and should match the same kind of workflow as the “hold control to bump physics asset” thing.

2.) Control rigs are affected by bones. This one I get WHY, but it seems counter-intuitive that you would ever want to make a rig that is controlled by a parent bone. I get the idea of attaching to a bone in order to have a small sub-object (for example a turret on a large ship).

3.) When you add a control to a bone it adds it to the children of that Bone. This would be fine if #2 wasn’t a thing.

4.) Adding default forward solve assignments is not automated. I bet I could find a python script to do this, but still, that blueprint tower of power really can and should be generated when a new control is made for a bone.

Still gonna push ahead with the control rigs though…Modular rigs freak me out (https://dev.epicgames.com/community/learning/tutorials/Dd31/unreal-engine-modular-control-rig-rigging-with-modules) and seem to be used primarily for bipedal skeletons.

I Built A TF2 Mod (Built meaning compiled)

I have tried to build a Source engine mod probably 5-6 times since I started programming in the late 2000s/early 2010s, and I’m finally ahead of the curve: I was able to get a mod built before Valve did something that wasn’t in the public C++ repos and broke everything! (See https://github.com/ValveSoftware/source-sdk-2013/tree/master)

This is not an accomplishment but I’m happy it’s possible. Will I do anything with this? Probably not….

What happens when I launch the mod??

uhhggg… I could figure this out, but honestly if I’m diving into Source I’d rather start with Source 2 and CS2. But there’s a 10-year-old part of me that longs to make a Source mod, put it on ModDB, start a dev blog, abandon the mod, notice another team picked up my mod, start a mod that competes against the original, and fall into a deep depression when the original mod team gets hired by Valve.

One can only dream.

“DATA INCOMING!”

Made a data display using three.js as a cool way to display alerts over time. The thought was that the radial view would be displayed in a corner, then would unravel as the user clicked the window to display a timeline of events.

There’s a hitch when converting from the polar plot to the linear plot, which comes from me dynamically making text objects instead of having them in memory and hidden. Further work on this will be adding an optional screen-space shader to make things look a bit more oscilloscope-y.