Was Nissan Ruined By AI?
A dangerous lesson for all engineers.
When we get into our cars,
we entrust them with our very lives.
So, seeing a major vehicle manufacturer get called on the carpet for an obvious software faux pas is concerning.
What was this error?
Why does it deserve attention?
Does it compromise consumer safety?
What concerning trends does it unearth for software at large?
To the last question, I’ll whet your appetite with two dangerous words:
Tutorials and AI.
Those words should raise red flags.
The Error
The year: 2016.
The landscape: no AI code generators.
The problem: an apparent copy-and-paste error.
It started with a tweet from Scott Helme on May 4, 2016, showing a screenshot of the Location Services screen from the NissanConnect® EV iOS app.
If you have ever built a mobile application that requests access to sensitive user information, you know that getting it right is essential to retaining user trust.
Nissan did not.
Here is the tweet in question:
A screenshot from the latest version of the Nissan ConnectEV app...
— Scott Helme (@Scott_Helme)
12:10 PM • May 4, 2016
Underneath the permission access setting is a spot for the app developers to explain why the permission is being requested and how it will be used.
Nissan’s explanation:
The spirit of stack overflow is coders helping coders
What does that mean?
Absolutely nothing. Unless you understand what “stack overflow” [sic] is.
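For context, iOS apps supply this permission rationale through a key in the app's Info.plist; the system displays the string in the location permission prompt. A minimal sketch of what a properly customized entry might look like (the key name is Apple's real, documented key; the description string is illustrative, not Nissan's actual text):

```xml
<!-- Info.plist fragment: the usage description iOS shows in the
     location permission alert. NSLocationWhenInUseUsageDescription
     is the Apple-defined key; the string below is a hypothetical
     example of app-specific wording, not Nissan's actual text. -->
<key>NSLocationWhenInUseUsageDescription</key>
<string>NissanConnect uses your location to find nearby charging stations and show your vehicle on the map.</string>
```

This string is the one place the developer speaks directly to the user about a sensitive permission, which is exactly why a placeholder slipping through here is so visible.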
What Went Wrong
Stack Overflow is the #1 online forum where software engineers ask difficult programming questions and get answers from the community.
It’s alive.
It’s helpful.
It’s the place to be.
It’s also a dangerous place to tread without thinking.
This exact quote in the production Nissan app came from this question on Stack Overflow about prompting a user for access to their location in an iOS app.
The exact phrase appears verbatim in an example answer from 2015.
The answer was intended to be an example.
A template.
A general direction.
Something you would change.
However, the Nissan engineer copied and pasted the solution verbatim.
In English class, this would be called plagiarism.
In software engineering, this is called gross negligence.
And that’s why it deserves attention.
Because we all do it.
The Impact
While it is easy to point fingers at a familiar brand when it makes a mistake, this incident hints at a much larger problem.
One could speculate what really went wrong.
Did the engineer simply not care?
Did the engineer forget to change it?
Did the engineer even know what they were doing?
I do not wholly blame the engineer.
The engineer should have caught that.
And the quality team.
And the product owner.
But only the consumers caught it.
Stack Overflow’s Problem
Copying and pasting is a luxury that the typewriter generation never had.
It’s also a double-edged sword they’ve wholly dodged.
And engineers are still guilty of this.
Myself included.
Stack Overflow is a great place to look for solutions. However, those solutions are often not tailored precisely to your needs, requiring bits and pieces from multiple solutions to solve the overall problem.
It is difficult, though not impossible, to find a perfectly tailored answer you can copy and paste wholesale. Those kinds of turnkey answers are just too rare. Most will get you 50%, 80%, or even 95% of the way there, with you filling in the rest.
Nevertheless, engineers still copy and paste without questioning.
The Consumer Mindset
Aleksandra Sikora, a Polish engineer and blogger, wrote that this problem stems from a consumer-based approach.
In her article, she stated the crux of the problem:
Over the past few years, as I’ve been working and talking with many developers, I noticed a repetitive pattern.... This pattern is consuming — instead of creating. Consuming — without questioning.
In this article from 2020, before the AI boom, she blamed laziness and naïve trust in “tutorials [which] show harmful patterns.”
Tutorials serve a purpose.
Tutorials are how I got started.
Tutorials are how many others got started.
But once you understand the basic concepts, they should serve only as guidelines to further your knowledge, not a runbook of cardinal rules from the nameless experts who wrote them.
Software is evolving.
Software has very few experts.
And experts still make mistakes.
Everything deserves evaluation and thought in software, not blind trust.
AI’s Bigger Problem
AI’s allure goes beyond that of Stack Overflow and tutorials.
AI can create a tailored solution that exactly fits your question. The temptation to copy and paste said solution without cross-checking is strong.
Ironically, the Stack Overflow Podcast discussed this very problem in episode 683: Is AI Making Your Code Worse?
In short, there is nothing wrong with copying and pasting. But before you do, make sure that:
You understand it thoroughly.
You adjust any incorrect suggestions.
The QA team thoroughly reviews the implementation.
You get a sign-off from the product owner that this behaves as expected.
You can’t outsource due diligence.
AI is no different from an intelligent colleague. Use its answers to guide you.
Always cross-check.
Always understand it.
Always use this power wisely.
The Future
Stack Overflow, tutorials, and AI.
These are the tools in the modern engineer's toolbox. They all have value but can also be misused, like a misguided hammer striking your thumb.
Can these tools and the mistakes they encourage compromise user safety and application quality?
Absolutely.
But so can inexperience.
Or poor planning.
Or a bad day.
Nothing can save you from laziness. Hard work looks the same now as it did before ChatGPT and Stack Overflow.
It’s up to the team to be just as vigilant as it should have been 5, 10, 15, and 20 years ago: pull together, hold each other accountable, and build great software.
You’ll be glad you did.
And your users will trust you for it.