  • This would work, but it assumes the primary use of the machine is Windows, and it significantly derates your performance under Linux due to USB speeds. Even if you’re storing your data on the Windows HDD, NTFS drivers are dog slow compared to ext4 and other *nix filesystems.

    Also, some BIOSes are a pain to get to boot off removable drives reliably, so it really depends on what your machine is.

    I’ve used Linux as a primary dev system for well over a decade now, and with the current state of Windows I’d really recommend just taking the leap: keep your Windows box if you need Windows software, and build a dedicated Linux workstation.


  • These microplastics are digestible by your immune system, though, which makes them ultimately harmless. PLA is used for drug delivery for this reason.

    Being concerned about incomplete PLA degradation is like being concerned about a piece of wood breaking down into micro-woods. Even if you get a dangerous shard of micro-wood embedded in your skin, your body can deal with that cellulose polymer just fine.

    Ultimately it will break down completely someday and in the meantime, nothing will be harmed.


  • I love the term “write-only code”; it’s perfect. I used to love Perl, as it felt like it flowed straight from my brain into the keyboard. What a free and magical language.

    So it turned out I had ADHD. Took meds, went back to C/C++ with renewed appreciation, and haven’t touched Perl since, as it horrifies me to look at it. What a nightmare of dangling references and questionable typing. Any language that allows you to cast a string to a function and call it really needs to sit down and think about what it’s doing.


  • True survivalist/libertarian types have always loved solar power.

    I don’t know how solar lost its space-age coolness, though, aside from active lobbying by the fossil fuel industry to kill it. For a while, solar was undoubtedly the power source of the future, the same thing that was on our space probes and satellites.

    I have old oil-crisis-era books and magazines on my shelf which absolutely loved solar power and billed it as the cheap energy solution for the common man. Somewhere we went wrong, and I think it was Reagan (in many ways…)


  • If you don’t want memory-safe buffer overruns, don’t write C/C++.

    Fixed further?

    It’s perfectly possible to write C++ code that won’t fall prey to buffer overruns (first sketch at the end of this comment); C is a lot harder. However, yes, it’s far from memory safe: you can still do stupid things with pointers and freed memory if you want to.

    I’ll admit, as I grew up with C, I still have a love for some of its oh-so-simple features like structs. For embedded work, give me a packed struct over complex serialization libraries any day (second sketch below).

    I tend to write a hybrid of the two languages for my own projects, and I’ll be honest, I’ve forgotten where exactly the line lies between them.
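
    For the curious, a minimal sketch of what I mean by overrun-resistant C++ (a toy example, not from any real project): std::array plus .at() turns an out-of-bounds access into an exception instead of silent memory corruption.

    ```cpp
    #include <array>
    #include <cstdio>
    #include <stdexcept>

    int main() {
        std::array<int, 4> buf{1, 2, 3, 4};
        try {
            // .at() is bounds-checked; buf[7] on a raw int[4] would be
            // silent undefined behaviour, not an exception.
            std::printf("%d\n", buf.at(7));
        } catch (const std::out_of_range&) {
            std::puts("caught the overrun instead of corrupting memory");
        }
    }
    ```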
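
    And the packed struct thing, roughly. SensorFrame and its fields are made up for illustration, and __attribute__((packed)) is GCC/Clang syntax (MSVC spells it #pragma pack(1)), so check your toolchain and target before copying this.

    ```cpp
    #include <cstdint>
    #include <cstring>

    struct __attribute__((packed)) SensorFrame {
        std::uint8_t  id;        // 1 byte
        std::uint16_t reading;   // 2 bytes, no padding inserted before it
        std::uint32_t timestamp; // 4 bytes
    };
    static_assert(sizeof(SensorFrame) == 7, "7 bytes on the wire, no padding");

    // "Serialization" collapses to a memcpy; byte order still has to
    // match on both ends of the link.
    void serialize(const SensorFrame& frame, std::uint8_t out[sizeof(SensorFrame)]) {
        std::memcpy(out, &frame, sizeof frame);
    }
    ```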



  • A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models, you have to review basically all output, as they can confidently introduce small errors. Often the code will compile and run, but it has small errors that cause output to drift, or the aforementioned long-run overflow-type errors.

    Those are the errors that junior or lazy coders will never notice and walk away from, causing hard-to-diagnose failures down the road. And the code “looks fine”, so reviewers would need to really go over it with a fine-toothed comb, which only happens in critical industries.

    I will only use AI to write comments and documentation blocks, and to get jumping-off points for algorithms I don’t keep in my head (“Write a function to sort this array”). It’s better than Stack Exchange for that, IMO.


  • I tried using AI tools to do some cleanup and refactoring of some legacy embedded C code, and I was curious whether it could do any optimization or knew any clever algorithms.

    It’s pretty good at figuring out the function of the code and adding comments, and it did some decent refactoring of some sections to make them more readable.

    But it has no clue how to work in a resource-constrained environment, or about the main concepts that separate embedded from everything else: that the code has to be able to run “forever”, that it has to operate in real time on a constant flow of sensor data, and that nobody else is taking care of your memory management.

    It even explained to me that we could do input filtering by using big arrays for simple averaging on a device with only 1 kB of RAM, or use a long long for a never-reset accumulator without worrying about what will happen, because “it will be years before it overflows”. (A sketch of the constant-memory alternative is at the end of this comment.)

    AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems, the whole world is going to get a lot more glitchy.
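
    For contrast, here’s roughly how input filtering is normally done on tiny parts: an exponential moving average, which needs one word of state instead of a big sample array. The names and the smoothing factor are just illustrative.

    ```cpp
    #include <cstdint>

    // Filter state, scaled by 2^SHIFT so integer math keeps some
    // fractional precision. Alpha = 1/16 here.
    static constexpr int SHIFT = 4;
    static std::int32_t ema_scaled = 0;

    // Feed every raw ADC sample through; O(1) memory, and the state is
    // bounded by (max raw << SHIFT), so it can run for decades without
    // the overflow a never-reset accumulator invites.
    std::uint16_t filter_sample(std::uint16_t raw) {
        const std::int32_t delta =
            (static_cast<std::int32_t>(raw) << SHIFT) - ema_scaled;
        ema_scaled += delta / (1 << SHIFT); // well-defined for negative delta
        return static_cast<std::uint16_t>(ema_scaled >> SHIFT);
    }
    ```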