A brainstorming session with a group of banks raised this question: are we going through a Kodak moment in banking? Are we seeing a Nokia change? Will banks miss the tipping point and die from the reformation of the internet, or will we respond and change in time?
As it’s Friday the Thirteenth, it’s a good time to discuss this question.
We all know the Kodak and Nokia stories of moving from Hero to Zero (for those that don’t, the stories are repeated in depth at the end of this blog with Kodak first and then Nokia).
So here are a few things we know:
- We have seen other industries decimated by digital – books, music, photography – and know that the same is happening to banking
- We know that a bank’s greatest asset is its data, yet banks do not leverage that asset: according to Forrester, only 3% of data is tagged and less than 0.5% analysed
- We know that banks are structured inefficiently in product silos that lack customer focus
- We know that we have legacy systems that are inefficient and need refreshment
- We know that cryptocurrencies are redefining the digitisation of money and currency
- We know that thousands of companies are launching innovative new models for managing money and value
- We know that billions of dollars are being ploughed into these new companies to force change in the banking system
We know we have to change.
OK, so we know a lot and it is clear that this could be a Kodak or Nokia moment. Kodak invented the digital camera but thought film was still the future. How wrong were they? Nokia owned the mobile market but let Apple turn the phone into a computer and steal its business. Why didn’t Nokia do something sooner? And banks are possibly going through the same. Or are they?
I qualify the change in banking because most people describe banking as a protected industry unlike any other. The regulatory, compliance, audit and governance requirements, combined with massively onerous capital reserve requirements, mean that few can get into the banking game.
This has certainly proven true over the past 25 years. During the past quarter century, everyone forecast that banks were dead and would be disintermediated. Hasn’t happened. It’s why there are only a few big banks in most countries, and little competition.
However, bearing in mind the list of challenges we face above, is that a good reason to sit back and be complacent today? And what about all of these new fintech start-ups, will they change the business?
Most bankers I talk to today say yes. Or at least most bankers who understand the fintech change in our world, of which there are some. These bankers believe that cryptocurrencies are designed to wipe out the banks’ middle-office processing structures; that P2P lending wipes out their credit product offers and more; and that the front-end relationship is being taken over by Apple Pay. These bankers believe Apple Pay will wipe out Visa and MasterCard over the next decade; that Bitcoin will replace SWIFT; and that they have to enable peer-to-peer connectivity for value exchange rather than act as the control freaks the banks have been for the past quarter century.
The trouble is that most of these bankers are not sitting in a decision-making capacity. They are executors and implementers of digital strategies, and struggle to get their voices heard in the upper echelons of management.
The decision-making executive team more usually comprises diligent banking people who have spent years dealing with regulations and compliance. They are the ones who believe that regulation is the barrier to entry, and that they only need to change due to regulatory or competitive forces. They see digital as a channel to market, and cryptocurrencies as a game. They have little interest in listening to the digital crowd who, more often than not, are seen as geeks and nerds in the banking cellars.
More on this tomorrow, but the question is: who is right? Are the digital bankers right in screaming and shouting that we are going through a banking Kodak moment; or is their senior management right in saying we only need to change at the speed of the fastest competitor?
I know who my vote goes with, but then I’m sitting in the cellar of a bank right now as I write this …
How Kodak Failed from Forbes January 2012, written by Chunka Mui, coauthor of The New Killer Apps: How Large Companies Can Out-Innovate Start-Ups
There are few corporate blunders as staggering as Kodak’s missed opportunities in digital photography, a technology that it invented. This strategic failure was the direct cause of Kodak’s decades-long decline as digital photography destroyed its film-based business model.
A new book by Vince Barabba, a former Kodak executive, offers insight on the choices that set Kodak on the path to bankruptcy. Barabba’s book, “The Decision Loom: A Design for Interactive Decision-Making in Organizations,” also offers sage advice for how other organizations grappling with disruptive technologies might avoid their own Kodak moments.
Steve Sasson, the Kodak engineer who invented the first digital camera in 1975, characterized the initial corporate response to his invention this way: But it was filmless photography, so management’s reaction was, ‘that’s cute—but don’t tell anyone about it.’
Kodak management’s inability to see digital photography as a disruptive technology, even as its researchers extended the boundaries of the technology, would continue for decades. As late as 2007, a Kodak marketing video felt the need to trumpet that “Kodak is back” and that Kodak “wasn’t going to play grab ass anymore” with digital.
To understand how Kodak could stay in denial for so long, let me go back to a story that Vince Barabba recounts from 1981, when he was Kodak’s head of market intelligence. Around the time that Sony introduced the first electronic camera, one of Kodak’s largest retailer photo finishers asked him whether they should be concerned about digital photography. With the support of Kodak’s CEO, Barabba conducted a very extensive research effort that looked at the core technologies and likely adoption curves around silver halide film versus digital photography.
The results of the study produced both “bad” and “good” news. The “bad” news was that digital photography had the potential to replace Kodak’s established film-based business. The “good” news was that it would take some time for that to occur and that Kodak had roughly ten years to prepare for the transition.
The study’s projections were based on numerous factors, including: the cost of digital photography equipment; the quality of images and prints; and the interoperability of various components, such as cameras, displays, and printers. All pointed to the conclusion that adoption of digital photography would be minimal and non-threatening for a time. History proved the study’s conclusions to be remarkably accurate, both in the short and long term.
The problem is that, during its 10-year window of opportunity, Kodak did little to prepare for the later disruption. In fact, Kodak made exactly the mistake that George Eastman, its founder, avoided twice before, when he gave up a profitable dry-plate business to move to film and when he invested in color film even though it was demonstrably inferior to black and white film (which Kodak dominated).
Barabba left Kodak in 1985 but remained close to its senior management. Thus he got a close look at the fact that, rather than prepare for the time when digital photography would replace film, as Eastman had with prior disruptive technologies, Kodak chose to use digital to improve the quality of film.
This strategy continued even though, in 1986, Kodak’s research labs developed the first megapixel camera, one of the milestones that Barabba’s study had forecast as a tipping point for the viability of standalone digital photography.
The choice to use digital as a prop for the film business culminated in the 1996 introduction of the Advantix Preview film and camera system, which Kodak spent more than $500M to develop and launch. One of the key features of the Advantix system was that it allowed users to preview their shots and indicate how many prints they wanted. The Advantix Preview could do that because it was a digital camera. Yet it still used film and emphasized print because Kodak was in the photo film, chemical and paper business. Advantix flopped. Why buy a digital camera and still pay for film and prints? Kodak wrote off almost the entire cost of development.
As Paul Carroll and I describe in “Billion-Dollar Lessons: What You Can Learn From The Most Inexcusable Business Failures of the Last 25 Years,” Kodak also suffered several other significant, self-inflicted wounds in those pivotal years:
In 1988, Kodak bought Sterling Drug for $5.1B, deciding that it was really a chemical business, with a part of that business being a photography company. Kodak soon learned that chemically treated photo paper isn’t really all that similar to hormonal agents and cardiovascular drugs, and it sold Sterling in pieces, for about half of the original purchase price.
In 1989, the Kodak board of directors had a chance to make a course change when Colby Chandler, the CEO, retired. The choices came down to Phil Samper and Kay R. Whitmore. Whitmore represented the traditional film business, where he had moved up the ranks for three decades. Samper had a deep appreciation for digital technology. The board chose Whitmore. As the New York Times reported at the time: Mr. Whitmore said he would make sure Kodak stayed closer to its core businesses in film and photographic chemicals.
Samper resigned and would demonstrate his grasp of the digital world in later roles as president of Sun Microsystems and then CEO of Cray Research. Whitmore lasted a little more than three years, before the board fired him in 1993.
For more than another decade, each new Kodak CEO would bemoan his predecessor’s failure to transform the organization to digital, declare his own intention to do so, and then proceed to fail at the transition as well. George Fisher, who was lured from his position as CEO of Motorola to succeed Whitmore in 1993, captured the core issue when he told the New York Times that Kodak “regarded digital photography as the enemy, an evil juggernaut that would kill the chemical-based film and paper business that fueled Kodak’s sales and profits for decades”.
Fisher oversaw the flop of Advantix and was gone by 1999. As the 2007 Kodak video acknowledges, the story did not change for another decade. Kodak now has a market value of $140m and teeters on bankruptcy. Its prospects seem reduced to suing others for infringing on patents that it was never able to turn into winning products.
Addressing strategic decision-making quandaries such as those faced by Kodak is one of the prime questions addressed in Vince Barabba’s book, “The Decision Loom.” Kodak management not only presided over the creation of technological breakthroughs but was also presented with an accurate market assessment of the risks and opportunities of those capabilities. Yet Kodak failed to make the right strategic choices.
This isn’t an academic question for Vince Barabba but rather the culmination of his life’s work. He has spent much of his career delivering market intelligence to senior management. In addition to his experiences at Kodak, his career includes being director of the U.S. Census Bureau (twice), head of market research at Xerox, head of strategy at General Motors (during some of its best recent years), and inclusion in the market research hall of fame.
“The Decision Loom” explores how to ensure that management uses market intelligence properly. The book encapsulates Barabba’s prescription of how senior management might turn all the data, information and knowledge that market researchers deliver to them into the wisdom to make the right decisions. It is a prescription well worth considering.
Barabba argues that four interrelated capabilities are necessary to enable effective enterprise-wide decision-making—none of which were particularly well-represented during pivotal decisions at Kodak:
1. Having an enterprise mindset that is open to change. Unless those at the top are sufficiently open and willing to consider all options, the decision-making process soon gets distorted. Unlike its founder, George Eastman, who twice adopted disruptive photographic technology, Kodak’s management in the ’80s and ’90s was unwilling to consider digital as a replacement for film. This limited the company to a fundamentally flawed path.
2. Thinking and acting holistically. Separating out and then optimizing different functions usually reduces the effectiveness of the whole. In Kodak’s case, management did a reasonable job of understanding how the parts of the enterprise (including its photo finishing partners) interacted within the framework of the existing technology. There was, however, little appreciation for the effort being conducted in the Kodak Research Labs with digital technology.
3. Being able to adapt the business design to changing conditions. Barabba offers three different business designs along a mechanistic-to-organismic continuum—make-and-sell, sense-and-respond and anticipate-and-lead. The right design depends on the predictability of the market. Kodak’s unwillingness to change its large and highly efficient ability to make-and-sell film in the face of developing digital technologies lost it the chance to adopt an anticipate-and-lead design that could have secured it a leading position in digital image processing.
4. Making decisions interactively using a variety of methods. This refers to the ability to incorporate a range of sophisticated decision support tools when tackling complex business problems. Kodak had a very effective decision support process in place but failed to use the information it produced.
While “The Decision Loom” goes a long way to explaining Kodak’s slow reaction to digital photography, its real value is as a guidepost for today’s managers dealing with ever-more disruptive changes. Given that there are few industries not grappling with disruptive change, it is a valuable book for any senior (or aspiring) manager to read.
There was once a time when my search for a new phone would start (and likely finish) with a visit to Nokia.com. The Finnish company had the widest choice, the best designs, and the most respected brand around the world, so it was pretty hard to pick a bad phone from its catalog. Try doing the same thing today, however, and you’ll find every link on the Nokia homepage pointing to Microsoft’s Mobile Devices division — the new incarnation of the Nokia most of us knew and loved. It’s a vastly different mobile world we’re living in now, but what’s most striking about it is that Nokia saw it all coming.
The best phone in the world today is dressed from head to toe in aluminum and has an outstanding camera that protrudes from its body. So did the Nokia N8 in 2010. The iPhone that’s collecting all the plaudits and sales now is basically the fulfilment of a vision Nokia had half a decade ago: combine the best camera with the best build materials and let others try to match you. The only thing Nokia didn’t do right with that phone was its software. The N8 design was ready to go in early 2010, when it would have been among the first with 720p video recording, but repeated delays of the new Symbian version pushed its release to September. The hardware was getting kneecapped by the software, which a Nokia employee told me at the time was being developed in separate silos that wouldn’t be integrated into a single operating system until the final weeks before launch.
“It’s big!” he says with a smile. “But it’s also beautiful and very thin this time.”
No, those aren’t the words of Apple’s Phil Schiller describing the iPhone 6 Plus; they are the proclamations of Anssi Vanjoki while presenting the Nokia E7 at Nokia World 2010. The E7’s 4-inch screen was considered large for its time, but Nokia knew where our preferences were heading. Watch the rest of its presentation from that September 2010 gathering and you’ll also hear of personalized location-based services not unlike Google Now. “And it is a space that we intend to own,” said Executive VP Niklas Savander at the time. As wildly optimistic as that may sound in hindsight, it was a justifiable ambition for a company that was the leader in mobile mapping and navigation services, even if its software left something to be desired.
Nokia’s biggest failure was an unwillingness to embrace drastic change. The company sowed the seeds for its self-destruction when it made “the familiarity of the new” the tagline for its big Symbian upgrade those many years ago. It feared alienating current users by changing too much, so it ended up with a compromised mess of an operating system that wasn’t fit for the future. Even as it was making one mistake, however, Nokia was keenly aware of the threat of another.
Jumping to Android was widely advocated as a quick shortcut to making Nokia’s software competitive, but Anssi Vanjoki dismissed that idea as a short-term solution that was no better than “peeing in your pants for warmth in the winter.” I was among those who thought him wrong, but the recent financial struggles of HTC, Motorola, and Sony have shown him to be more prophetic than paranoid. Nobody outside of Google, Samsung, and Microsoft (by virtue of patent royalty payments) is making real money off the sales of Android phones.
Eventually, Nokia’s hand was forced into making a switch and it chose Microsoft’s Windows Phone as the platform to build its future on. That construction project is still going on, though it no longer carries the Nokia name. Before Windows Phone, we got a glimpse of what might have been with the introduction of the Nokia N9. It ran the open-source MeeGo OS that Nokia was developing as a successor to Symbian, and it infused a breath of fresh air into both hardware and software design for phones.
The N9’s unibody was so desirable to look at and delightful to the touch that it spawned a family of Windows Phone progeny that continues today with the Lumia 730. The multitasking overview and switcher of the N9 has also been widely emulated in devices of all creeds and operating systems, and so has its double-tap-to-wake functionality. That phone was, and remains, a revelation. Still, MeeGo development wasn’t proceeding as quickly as new Nokia boss Stephen Elop had wished, and there was no app ecosystem to speak of, so the N9 and its kind were banished in favor of the more pragmatic Microsoft-led approach.
Nokia’s 2007 vision of the future was remarkably similar to Apple’s.
The list of things Nokia saw coming but failed to adapt to is regrettably long. Another instance where Anssi Vanjoki seemed to exaggerate was when he predicted that DSLRs would be replaced by cameraphones. I mocked his outlandish claim then, but to look at the new iPhones, the Panasonic CM1, and Nokia’s own Lumia 1020, there are now certainly enough excellent options to make at least a few people drop the bulky dedicated camera. Nokia is, of course, not unique in its anticipation of future trends, but it has been better at it than its epic fall from dominance would suggest. Like Palm with webOS, Intel with Mobile Internet Devices, and even Xerox with the graphical user interface, Nokia has repeatedly demonstrated that being first to a good idea is no guarantee of commercial success.
Before the iPhone had apps and Android got Maps, Nokia phones had both. Today there isn’t a flagship smartphone without either a metal or faux metal finish. If only Nokia’s software were as good as its foresight and hardware, my visits to its homepage would still be producing the same frisson of excitement that they once did. Instead, I’m left staring vacantly at the four squares of the Nokia apocalypse that join up to form the Microsoft logo.