Systematic inventive thinking is a globally renowned innovation framework consisting of five techniques: subtraction (the basis of subtractive innovation), division, multiplication, task unification, and attribute dependency.
Subtractive innovation involves removing or simplifying elements to improve a product, process, or system. Consumers often value simplicity over complexity, which is what makes it such a potent technique.
GlobalData’s 2024 Consumer Survey revealed that 76% of respondents view simple ingredients as either ‘essential’ or ‘nice to have’ when making purchasing decisions.
Key takeaways
Subtractive innovation has driven watershed advancements across various industries. In 1993, Dyson launched its first bagless vacuum cleaner, killing the vacuum cleaner bag. The company has since become a byword for luxury, style, and technological dominance in the vacuum cleaner space.
Fast forward to the 21st century and Apple stands as the quintessential embodiment of subtractive innovation in the tech sector.
Apple’s iPhone 7, unveiled in 2016 alongside the first-generation AirPods, controversially killed the headphone jack. More than 300,000 people signed a petition lobbying against Apple’s move to “screw consumers and the planet”. Yet AirPods sales are estimated to have totalled more than $18bn in FY2023 alone, and AirPods could become Apple’s third most lucrative product behind iPhones and Macs over the next decade.
Targets are wired
Wired accessories are prime targets for subtractive innovation given their more convenient and sleeker wireless counterparts. Apple’s 2017 debuts of the iPhone 8 and iPhone X marked the introduction of wireless charging for iPhones (and in the case of the latter, the rather small matter of killing the home button).
It would be a classic move straight from Apple’s playbook to eventually eliminate the charging port from its iPhones altogether.
Elsewhere, Tesla has been at the forefront of subtractive innovation in the automotive sector. The company has challenged conventional automotive designs by killing traditional buttons and stalks in favor of a minimalist design and a touchscreen interface.
AI and subtractive innovation
At the forefront of tech, generative AI is set to undergo a subtractive innovation movement.
GlobalData sees small language models (SLMs) as defining enterprise generative AI for the foreseeable future. SLMs are compact AI models trained for specific use cases, delivering high efficiency and strong relevancy.
SLMs typically have fewer than 10 billion parameters, compared to the hundreds of billions, or even trillions, of their largest counterparts.
SLMs are also typically more accurate than LLMs for specialised tasks because they are trained on refined, application-specific datasets. They are also cheaper than LLMs and can run at the edge, meaning data never leaves the local network. SLMs are a case in point that, as is so often true of subtractive innovation, less can mean more.
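To make the edge-deployment point concrete, the sketch below shows one common way to run a compact open-weights model entirely on local hardware using the Hugging Face transformers library. The model name (TinyLlama, roughly 1.1 billion parameters) and the prompt are illustrative assumptions rather than a GlobalData recommendation; any similarly sized SLM fine-tuned for the task at hand could be swapped in.

```python
# Minimal sketch: running a small language model locally so that prompts and
# outputs never leave the machine. Assumes `pip install transformers accelerate torch`.
from transformers import pipeline

# TinyLlama (~1.1bn parameters) is an illustrative open-weights SLM; substitute
# any compact model trained for your specific use case.
generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    device_map="auto",  # use a local GPU if available, otherwise the CPU
)

prompt = "Summarise the following support ticket in one sentence:\n..."
result = generator(prompt, max_new_tokens=64, do_sample=False)

# Generation happens entirely on local hardware; nothing is sent to an external API.
print(result[0]["generated_text"])
```

Because a workload of this size fits on commodity hardware, the pattern keeps sensitive data inside the local network, which is the crux of the edge argument above.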