Live Q&A - How to Get the Bugs Out of your Embedded Product
Dave Nadler - Watch Now - EOC 2021 - Duration: 17:32
Thanks Anders for the kind words. I can't cite any references off-hand. It's closer to waterfall than some of the goofy stuff sometimes proposed. Using requirements to immediately define tests (before diving in and starting development) clarifies the requirements and minimizes late surprises; do this and you will always find holes in the initial requirements. Decomposing top-level tests (which may only pass near project completion) into constituent/prerequisite tests helps define the lower-level interfaces needed for testing.
Hope that helps!
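To make the idea concrete, here is a minimal sketch (a made-up requirement, not from Dave's project): suppose a requirement says "button presses shorter than 50 ms must be ignored (debounce)". Writing the test first forces a hardware-free interface into existence, which is exactly the lower-level interface definition described above:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical interface shaped by writing the test first: the
   debounce decision takes a press duration in milliseconds, so it
   needs no hardware to be exercised. */
#define DEBOUNCE_MS 50u

static bool button_press_accepted(unsigned press_duration_ms)
{
    return press_duration_ms >= DEBOUNCE_MS;
}

/* The test, written directly from the requirement, before (or
   alongside) the implementation: */
static void test_debounce_requirement(void)
{
    assert(!button_press_accepted(0));    /* glitch: ignored   */
    assert(!button_press_accepted(49));   /* just under: ignored */
    assert(button_press_accepted(50));    /* boundary: accepted */
    assert(button_press_accepted(200));   /* normal press: accepted */
}
```

Writing the boundary cases (49 vs. 50) immediately surfaces a hole a prose requirement often leaves open: is exactly 50 ms accepted or ignored?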
Best Regards, Dave
Hi Dave,
Congratulations on an excellent presentation full of precious, hard-won lessons. I really hope that the younger fellow embedded developers are paying ATTENTION! Learning from experienced developers is much cheaper than repeating all the mistakes yourself.
The most important, recurring message of your presentation is to develop as much of the embedded code as possible ON THE DESKTOP. It's amazing to me how closely your experience in this area matches mine. I also always start every embedded project with a PC-based "simulation" or "prototype" of the device.
But unfortunately, this development strategy (also known as "dual targeting") is widely misunderstood and sometimes provokes adverse reactions. For example, here is a link to my StackOverflow post "Prototyping and simulating embedded software on Windows". As you can read there, the post was generally misunderstood, criticized, and finally closed as "speculation".
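The core of the dual-targeting strategy can be sketched in a few lines (all names here are illustrative, not from any particular project): the application logic calls only a small board interface, which has one implementation for the target hardware and another, shown here, for the desktop build:

```c
/* Minimal dual-targeting sketch. The application logic below is pure
   and hardware-free; only the board_* functions differ between the
   target build and this desktop "simulation". */
#include <stdint.h>
#include <stdio.h>

/* board.h -- the only "hardware" the application logic sees */
void     board_led_set(int on);
uint32_t board_millis(void);

/* Application logic: testable on any host, unchanged on target */
int blink_should_toggle(uint32_t now_ms, uint32_t last_toggle_ms,
                        uint32_t period_ms)
{
    return (now_ms - last_toggle_ms) >= period_ms;
}

/* board_desktop.c -- desktop implementation of the board interface */
static uint32_t fake_clock_ms;

void board_led_set(int on)
{
    printf("LED %s\n", on ? "ON" : "OFF");  /* stand-in for a GPIO */
}

uint32_t board_millis(void)
{
    return fake_clock_ms;                   /* simulated clock */
}

/* Test hook only the desktop build has: advance simulated time */
void sim_advance_ms(uint32_t ms)
{
    fake_clock_ms += ms;
}
```

On the target, `board_led_set()` and `board_millis()` would instead touch a GPIO register and a hardware timer; the blink logic is compiled and tested identically in both builds.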
Anyway, in your talk you mention wxWidgets as your prototyping platform. I'd like to second the requests from the Q&A session for you to publish an example device simulation with that GUI.
But if anybody is interested, an even simpler GUI prototyping platform called "QWin" is available for download NOW on GitHub as part of the QTools collection.
The complete QWin code with a working example and a detailed Application Note explaining the process of simulating embedded devices on Windows (with the free Visual Studio) is also available:
The QWin prototyping platform is based directly on the Win32 API in C, so it binds directly to pure C code (as well as to C++ code). This is as simple as it gets, and no third-party libraries or DLLs are needed. Still, QWin supports graphic LCDs (monochrome and color), segmented displays, LEDs, push-buttons, and anything else you can put in a Windows dialog box.
Thanks Miro for the kind words. In https://www.embeddedonlineconference.com/meeting/Live_Discussion_How_to_Get_the_Bugs_Out_of_your_Embedded_Product I did mention your product. I suspect your Windows simulator perhaps frightens people off with the raw Windows API (though of course some of us have built products that way). We prefer an easier, OS-agnostic framework; the latest one we're using is wxWidgets. Perhaps I should recode your QP Windows example using wxWidgets...
Best would be a PC framework (Windows, Linux, etc.) that provides:
- running from test scripts with result capture,
- capture of manually entered events into a script for later playback,
- comparison of run results (including things like LCD screen images).
This would give a great basis for CI/CD...
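A toy sketch of the record/playback idea in that wish-list (all names are hypothetical, not an existing framework): manually entered events are appended to a script, and a later run replays the script into the application so the captured results of two runs can be diffed:

```c
/* Minimal event record/replay sketch for scripted regression runs.
   Real frameworks would persist the script to a file and capture
   richer results (e.g. LCD images); this shows only the skeleton. */
#include <stdio.h>
#include <string.h>

#define MAX_EVENTS 64

typedef struct {
    unsigned t_ms;      /* timestamp of the event */
    char     name[16];  /* e.g. "BTN_DOWN", "BTN_UP" */
} Event;

typedef struct {
    Event ev[MAX_EVENTS];
    int   count;
} Script;

/* Record an event (e.g. from a GUI button handler) into the script. */
void script_record(Script *s, unsigned t_ms, const char *name)
{
    if (s->count < MAX_EVENTS) {
        s->ev[s->count].t_ms = t_ms;
        snprintf(s->ev[s->count].name, sizeof s->ev[s->count].name,
                 "%s", name);
        s->count++;
    }
}

/* Replay: feed each recorded event back into the application via a
   handler callback, so the run can be repeated without a human. */
void script_replay(const Script *s, void (*handler)(const Event *))
{
    for (int i = 0; i < s->count; ++i)
        handler(&s->ev[i]);
}
```

With the application's outputs captured to a buffer or file during `script_replay()`, a CI job can diff each run against a known-good baseline, which is exactly the CI/CD basis described above.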
Thanks again for the kind words,
Best Regards, Dave
Hi Dave... I have a question... In the software field there are tools that assess the quality of code and highlight areas to improve, like SonarQube (https://www.sonarqube.org/). Is there a tool focused on the quality of code developed for embedded systems?
I'm not aware of tools specific to embedded. I'm told some high-end tools can detect concurrency issues, but I don't know any specifics, sorry. Much static analysis is generic and will work fine for embedded. In the example project of this talk, I tried Cppcheck and it found only 1 of the top 12 bugs (in this example, 6 of the 12 could theoretically have been found by static analysis). Hope this helps!
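For readers unfamiliar with what generic static analysis does catch, here is an illustrative example (not one of the twelve bugs from the talk): an off-by-one loop bound that writes past the end of an array, the kind of defect tools in the Cppcheck class typically flag without ever running the code:

```c
/* Illustration of a defect class that generic static analysis
   catches. The buggy version used `i <= N`, which reads one element
   past the end of the array -- undefined behavior that may "work"
   in testing and fail in the field. */
#define N 8

int sum_fixed(const int *a)
{
    int sum = 0;
    /* buggy version was: for (int i = 0; i <= N; ++i)  -- off by one */
    for (int i = 0; i < N; ++i)   /* correct bound */
        sum += a[i];
    return sum;
}
```

Concurrency bugs and logic errors (the other half of the talk's dozen) are precisely what such tools tend to miss, which is why simulation and testing remain necessary.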
I'm OK with ASSERT() as long as you can guarantee that the assert WILL be executed during testing. I've seen cases where programmers use asserts but never create test cases in which the assert is executed during testing; of course, some of these asserts can then crash the system in the field, which is not desirable.
As with many applications, this one is built with NDEBUG for target builds, so no explosions on target are possible. In the Mars Perseverance presentation, IIRC Steve alluded to assert failure triggering a system reset. We don't want asserts to fire in regression testing; an assert failure means the application logic is incorrect and must be fixed. Asserts are intended to help find and fix application logic errors as quickly as possible during development. So an assert should be executed, but never fire, during testing. Hope I explained my intent OK...
I agree with Jean that assertions need to be tested, just like any other code. Or, actually, even more so, because they can bring down the whole system. I'm assuming here that assertions ARE enabled in the production software, as they should be.
Great talk Dave, and one I'll be going back to watch again.
Just a +1 to the Q&A session plea for you to share a stripped back example of the simulator and application code if you can. Having a template/example makes it much easier to get started!
Excellent presentation!
10:42:34 From Leandro Pérez : Hello everyone... Good morning from Colombia
10:42:42 From Rocco Brandi : it seems that simulation is the next logical step after TDD. where can I find some resources (books, courses, tutorial, etc.) that can help me start writing a simulator?
10:43:25 From Al Anway : how did you know you'd found all the bugs in the FLARM product?
10:44:00 From patelk5 : What kind of bird is that?
10:44:50 From Steve Wheeler : It might be an African Grey parrot.
10:45:19 From Raul Pando : Hi Dave, effective code reviews are fundamental when they're available. Sometimes when working in a reduced team the level of expertise may be limited, are there any second best alternatives in scenarios where a developer can't have access to peer reviews?
10:46:13 From Bob Dowling : Definitely not a Norwegian Blue... HA!
10:47:02 From Leandro Pérez : I agree Raul... In my case the team is always only me.... lol.... Sometimes it is necessary to answer to someone... but only I can resolve the problems that I found
10:50:38 From David Potter : Have you looked at code analysis tools like MathWorks Polyspace Code Prover? Does it cover areas perhaps missed by simulation?
10:51:20 From Leandro Pérez : I have a question... In software there exist tools that see the quality of the code developed, showing the areas to improve... like a Sonar... Is there a tool focused on embedded systems?
10:51:32 From Barry Robertson : Question: Any tips/suggestions for automated testing of touch-screen GUI interfaces - tools/techniques? Not simulating them but testing them.
10:52:16 From Rob Meades : A different approach to rubber ducking :-).
10:54:05 From Vinicius : Could you share the code used in the presentation?
10:56:30 From Radu Pralea : https://www.mattrobot.ai/
10:56:48 From Bob Dowling : I'd be interested in the simulator code too.
10:57:13 From Gary : @Radu that's really cool
10:57:47 From Leandro Pérez : In the software field there exist tools that see the quality of the code developed, showing the areas to improve... like a Sonar (https://www.sonarqube.org/)... Is there a tool focused on the quality of the code developed on embedded systems?
10:57:58 From Radu Pralea : Yea, I saw this in action, it's quite impressive. Not sure about the cost :)
10:58:23 From Charles Miller : Sounds like a nice GitHub project that is an "example" project learners could use. If I only had time....
10:59:15 From Rob Meades : Thanks Dave!
10:59:17 From Gerhard : Thanks!
10:59:22 From Rocco Brandi : thanks!
10:59:26 From Steve Wheeler : Thank you.
10:59:27 From Raul Pando : Thanks Dave
10:59:30 From Vinicius : Thanks!
Really great talk Dave. I learned a lot from you thinking out loud through your decisions.
One question I have is regarding Test-Driven Design; I'm familiar with Test-Driven Development, but hadn't heard of this concept.
Could you give some example test cases for Test-Driven Design?
And at what point should they pass? I'm guessing only when the product is finished or nearly finished.