2 Game Changing Trends that will Define Embedded Systems in the 2020’s

The last decade has seen amazing advancements in embedded system development techniques, tools and technologies. A decade ago, I recall being amazed when a microcontroller had a clock speed above 48 MHz with an unbelievable 128 kilobytes of flash. Today, we have microcontrollers with clock speeds above 1 GHz and more than 4 megabytes of flash storage, specifications that would make even my first personal computer jealous. This dramatic increase in microcontroller capabilities at affordable costs is going to usher in a completely new design paradigm in the decade to come. Let’s examine two trends in embedded systems development that I believe will prove to be game changers in the 2020’s.

Trend #1 – The Rise of Python as a Dominant Language

Python is already the most popular programming language used by software developers outside the embedded systems industry. In fact, a survey conducted this year by IEEE found that amongst engineers, Python is the number one programming language, followed by Java and then C [1]. The Aspencore 2019 Embedded Markets Study also found that in the last two years, the number of projects programmed in Python in the embedded space has doubled [2]! (Keep in mind the study also found that there was no change in the number of projects using C.) So, what is it about Python that makes me think it will become a dominant language for embedded systems?

First, as I discussed in the introduction, the compute power available in microcontrollers has grown to the point where a stripped-down version of a Python kernel can be run on a microcontroller that costs only a few dollars. Second, there are already popular open-source ports of Python, such as MicroPython, available on more than a dozen architectures, including popular ones like the STM32 and the ESP32.
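
To give a feel for what this looks like in practice, here is a minimal MicroPython sketch. It assumes an ESP32-style board with an LED on GPIO2 and an analog sensor on GPIO34; those pin numbers are illustrative assumptions, not a recommendation for any particular board.

    # Minimal MicroPython sketch (assumed ESP32 pin numbers, adjust for your board)
    from machine import Pin, ADC
    import time

    led = Pin(2, Pin.OUT)           # GPIO2 driven as a digital output
    sensor = ADC(Pin(34))           # GPIO34 read as an analog input

    while True:
        reading = sensor.read()     # raw 12-bit reading, 0 to 4095
        led.value(reading > 2048)   # LED on when the reading passes the midpoint
        time.sleep_ms(100)          # no registers, linker scripts or startup code

The same script can be typed into the interactive REPL or copied onto the board as a file, which is a large part of the appeal.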

Third, C and C++ aren’t taught in most computer science or engineering programs. It’s now Python and some Java, and it has been that way for quite some time. This means that there is, and will continue to be, a whole generation of engineers taking the lead in the next decade who have a natural inclination toward using Python.

Finally, as I attend conferences and talk with prospects and colleagues, I’m already seeing a natural pull toward Python. No one wants to fight with low-level hardware and software anymore. They want their microcontroller to arrive running something that lets them add their application-specific code quickly and get their product to market. Forget worrying about registers, pointers and all the traditional embedded details. Plus, if Python is used, anyone can help develop the product, not just the embedded folks.

Ready or not, here Python comes!

Trend #2 – Machine Learning at the Edge

I really wanted to avoid having machine learning as a game changing trend for the upcoming decade. I feel like the hype around machine learning is enormous. I can’t open a newsletter or read a blog (or apparently write one) without machine learning showing up. The fact, though, is that machine learning holds a lot of potential for embedded systems developers as we begin a new decade.

Machine learning for embedded developers, as it currently stands, has the greatest potential at the IoT edge. Up until recently, machine learning was done somewhere “out there” and had little, if anything, to do with embedded developers. Remember, though, when I discussed the rapid advancements in microcontroller hardware in the introduction? Those advances are making it far easier to run machine learning inferences on a microcontroller.

Running the inference on the embedded controller at the edge opens up a whole range of local applications and can save on bandwidth and communication costs with the cloud. One area that seems particularly primed for machine learning at the edge is embedded vision. The ability to perform object detection and recognition at the edge creates many potential opportunities for business applications and gives developers a way to lighten their workload.
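
As a rough sketch of what a local inference can look like, the snippet below uses the tflite_runtime Python package on a Linux-class edge device. The model file name and the zero-filled “frame” are placeholders; real code would load a trained model and capture an actual camera frame.

    # Rough sketch: image classification at the edge with tflite_runtime.
    # "model.tflite" is a placeholder for a trained, converted model.
    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    interpreter = Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Stand-in for a captured camera frame, shaped to match the model's input.
    frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

    interpreter.set_tensor(input_details[0]["index"], frame)
    interpreter.invoke()                                  # inference runs locally
    scores = interpreter.get_tensor(output_details[0]["index"])
    print("Most likely class index:", int(np.argmax(scores)))

Nothing leaves the device: no frame is uploaded, and the only thing that might ever need to travel to the cloud is the result.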

The vast amounts of data and the libraries that are currently available make it very easy to train new machine learning models. Even as I write this, there are teams of specialists working on optimizing tools and libraries so that inferences can run on an embedded controller. In fact, we are already at the point where you can run an inference on an Arm Cortex-M4 processor. I know we are already getting tired of talking about machine learning, but the industry is just getting started for us embedded systems engineers.
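
To illustrate one common path, the sketch below converts a small Keras model into an integer-quantized TensorFlow Lite flatbuffer of the kind that runtimes such as TensorFlow Lite for Microcontrollers can then execute on a Cortex-M4. The tiny model architecture and the representative_data() generator are stand-ins; a real project would train the model and calibrate with genuine samples.

    # Sketch: shrink a Keras model into an int8 TensorFlow Lite flatbuffer.
    # The model and representative_data() below are illustrative stand-ins.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(96, 96, 1)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])
    # ... training with model.fit() omitted for brevity ...

    def representative_data():
        # A few typical inputs let the converter calibrate int8 ranges.
        for _ in range(100):
            yield [tf.random.uniform((1, 96, 96, 1))]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]

    with open("model.tflite", "wb") as f:
        f.write(converter.convert())  # a flatbuffer small enough for on-chip flash

The resulting file is typically tens to a few hundred kilobytes, which is exactly the range that today’s larger microcontrollers can hold in flash.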

Conclusions

The next decade has the potential to dramatically change the way that products and embedded systems are developed. In this post, we’ve explored the two game changing trends that I believe will have the biggest impact on embedded systems developers. There are certainly many other trends that we will see in the 2020’s, but we will have to save those for another time. I think the next decade will find that the majority of applications use not just Python, but also machine learning.

What do you think, will Python and Machine Learning be the two major game changers in the coming decade?

References

  1. https://spectrum.ieee.org/computing/software/the-top-programming-languages-2019
  2. https://www.embedded.com/wp-content/uploads/2019/11/EETimes_Embedded_2019_Embedded_Markets_Study.pdf


9 thoughts on “2 Game Changing Trends that will Define Embedded Systems in the 2020’s”

  1. Python: If your goals are basically a maker mentality of getting something out the door, bugs be damned, then Python (and Java) are good languages for that.

    They are horrible languages for embedded development, however, as they make it easy to introduce bugs: nasty ones that involve typos, the kind I learned about when programming in FORTRAN 40 years ago.

    So the problem here is in getting these future programmers into a mindset of professionalism and craftsmanship rather than the maker mentality.

    Anyone can make one of anything. Making a product is a whole lot harder.

    • I think we have to be careful, though, not to consider everyone who writes Python a maker. Python has very strict guidelines on how to properly design and architect code. I’ve also found that Python developers, at least the ones I interact with, are far better at creating automated tests and performing regression testing than most C programmers I know.

      I think you’ve hit it on the head when you mention that it is the mentality of the developer. Thanks for the comments!

  2. The other thing that is bad in embedded is heap space. MISRA has rules against it for good reasons.

    Not only do you get memory leaks, but when you can’t control very tightly *when exactly* garbage collection happens (which both Python and Java implicitly live on), you need far more hardware than the job actually requires. Basically, you have to compensate for the non-real-time framework you now live with by throwing more hardware at the problem than would otherwise be needed, just to meet real-time deadlines. Usually this isn’t so much a cost issue as it is other issues like power consumption.

    • Thanks for the comment. I certainly don’t disagree with you, but I also think that there are a lot of applications out there where this is becoming an option. I think it’s also interesting to remember that OS’s like embedded Linux make use of heap space, and many of these systems operate just fine. In fact, if you look at SpaceX’s Falcon rockets, I hear they are completely coded in Python and have been very successful at handling their real-time requirements. (I know, a slightly different scale in processing power.)

      I have had a lot of these same concerns with several of the embedded Python languages, but having explored and watched how MicroPython has evolved over the past 3 to 4 years, I find that I have fewer and fewer concerns. I hold MISRA as a recommended standard for all embedded development, but we should keep in mind that it is designed for safety-critical applications. There are plenty of non-safety applications where Python could apply, and I certainly don’t think Python will ever be used for safety-critical systems.

      Interestingly, these embedded Python ports often also include APIs for controlling garbage collection and managing energy consumption; a minimal sketch of the garbage collection hooks is below. I completely agree there are certainly issues that anyone trying to use embedded Python would have to carefully work through, especially for a production system.
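
      A minimal sketch, assuming MicroPython’s built-in gc module on a port that exposes it (the 4 KB threshold is just an example value):

          import gc

          gc.collect()            # force a collection at a moment of our choosing
          print(gc.mem_free())    # bytes currently free on the MicroPython heap
          gc.threshold(4096)      # ask for a collection after roughly 4 KB of new allocations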

      • Hello sir, I don’t think Falcon rockets run Python. They run Linux with C and C++ programs. They use CSS for the touch screen UI. Here you can see all those things: https://www.google.com/amp/s/techfastly.com/falcon-9-software/amp/
        But you are right sir, nowadays Python is getting used more in embedded systems. I’m also developing a project with Python on a Raspberry Pi Zero W, and I have seen a coffee vending machine that uses a Raspberry Pi 3 with a touch screen UI running Python and Kivy for its operations.
        Thank you for running these kinds of blogs.

    • Embedded Python currently is memory hungry, but when you look at how businesses need to get to market ASAP, the cost and time to develop an application in C, and the decreasing cost of memory and microcontrollers along with the increase in capabilities, I think it will be a very interesting option for developers. In fact, you can take an ESP32, a Wi-Fi / BT combo chip that costs just a few dollars, put MicroPython on it, and develop a full sensor node application in just a few days (a bare-bones sketch of the idea is below). It certainly won’t be as efficient as writing in C, but efficiency isn’t everything, and the ~350 kB of flash required for some of these solutions is becoming negligible.
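
      A bare-bones sketch of that kind of node, assuming the MicroPython ESP32 port plus the umqtt.simple client from micropython-lib; the credentials, broker address, topic and pin number are all placeholders:

          import time
          import network
          from machine import Pin, ADC
          from umqtt.simple import MQTTClient  # from micropython-lib, not bundled on every port

          # Join the local Wi-Fi network (placeholder credentials).
          wlan = network.WLAN(network.STA_IF)
          wlan.active(True)
          wlan.connect("my-ssid", "my-password")
          while not wlan.isconnected():
              time.sleep(1)

          sensor = ADC(Pin(34))                # placeholder analog sensor on GPIO34
          client = MQTTClient("sensor-node", "broker.example.com")
          client.connect()

          while True:
              client.publish(b"sensors/node1", str(sensor.read()).encode())
              time.sleep(60)                   # report a reading once a minute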

      Thanks for the comment and I appreciate your thoughts on this! (I think the topic is very intriguing!)

  3. Python is great for small-scale projects, but as the industry standard, C will keep its position for a long time.

    What do you think about Rust?

    • C will keep its position for a long time, but I can see a large number of systems in various industries moving completely to Python. For example, Industrial IoT has no reason to use C; in fact, using C would be a detriment to the cost and time investment. On the other hand, anything safety-critical will obviously stay with C for a long time to come.

      I think Rust is over-hyped, at least for now. It’s extremely interesting, but there are currently no standards around it, and I think Python is more likely to gain market share than Rust. Only time will tell, though.
