Interoperability & the Future of Disruption | EPAM Continuum

In complicated spaces such as education, healthcare, and government, walled gardens and incompatible systems undermine both the basic utility of technology and its promise to solve problems: eased communication, greater understanding through data, transformation at scale. As big tech moves into these fields and regulatory consensus builds around interoperability, the industry is reaching a turning point on what it means to be disruptive.


This year, technical and bureaucratic infrastructure across public and private sectors crashed under the extraordinary demands created by the pandemic. The US public health system found its coronavirus response hampered by a hodgepodge of private players (big national laboratories, small labs, doctors’ offices, and hospitals) reporting data without standards and via disparate methods, some deeply inefficient and prone to error, including, astoundingly, fax machines.¹⁷ Hong Kong found itself unable to distribute coronavirus funds to citizens for lack of a central database.¹⁸ The city also had difficulty sharing health data between agencies due to dated IT systems, and like the US, officials and administrators often resorted to analog channels.¹⁹

Early on, governments everywhere, from the UK to India to Qatar, launched hastily developed contact tracing apps that created huge cybersecurity and privacy risks.²⁰ In an almost comical twist, when countries switched to the Apple-Google solution, many officials expressed concern that the strict privacy settings would make it impossible to know whether the apps were effective.²¹ In the US, states were originally supposed to build their own apps using the Apple-Google system: That didn’t work out, but siloed and potentially incompatible state-by-state systems would have undermined the premise of the apps anyway.²²

Almost all of these failures are, in small or large part, stories of interoperability and data standards. That’s not a coincidence–by nature, epidemiology requires the collection of vast amounts of population-scale data, and pandemic response requires a concerted, collaborative effort. Old and/or neglected systems–like New Jersey’s notorious COBOL-dependent mainframes–needed to work with new technologies, and data needed to be shared between the public and private sectors and from one agency to another.

But governments are not players in a parable, realizing too late the value of working together. Regulators around the world were already trying to solve these problems. The pandemic has just added more visibility and urgency to pre-existing initiatives, many of which will be coalescing in 2021.

The European Commission released a new data strategy last February, and plans to introduce a Data Act next year to further advance data standards and interoperability.²³ Both Mozilla and Harvard’s Shorenstein Center on Media, Politics, and Public Policy released papers within the last year²⁴ʼ²⁵ that put standards and interoperability at the center of future tech competition policy. In October, the US House Judiciary Committee published a report that recommended interoperability and data portability as solutions to antitrust in tech.²⁶

In the US, healthcare interoperability guidelines go into effect next summer, and the FDA is trying to bring its policies around data sharing and use in line with the current state of technology, which deeply impacts the pharmaceutical industry.²⁷ Analysts from Frost & Sullivan predicted this year that the global market for healthcare interoperability will double by 2024, largely driven by regulation and policy requirements.²⁸

Interoperability is an issue that spans all industries, not just healthcare. Farmers and other experts say it’s one of the biggest issues holding back agtech,²⁹ʼ³⁰ where users have a history of conflict with closed technical systems. Interoperability is the linchpin of decentralized finance, a movement at the vanguard of fintech.³¹ It’s also a known issue in edtech, even if no one in education really has time to think about it this year.³²

Government, healthcare, finance, agriculture, and education are all fields that struggle with legacy IT. They’re spaces with entrenched, complex systems, and real problems. They’re all frontiers for big tech.

In the past, innovation in the high-tech private sector meant going around established systems (for instance, the taxi industry). It was, as it is now cliché to note, about moving fast and breaking things. But tech is increasingly trying to get involved in the kinds of problems that require working within messy entrenched systems.³³ And as is appropriate for working on challenges that involve collaboration between many organizations and stakeholders, the discourse of tech culture has evolved from breaking things to building them.³⁴

Incompatibility has always been used as a competitive moat (think of the ‘walled gardens’ of telecom giants in the pre-smartphone era), but the mechanism that keeps users from switching also holds technology back from some of its true transformative, problem-solving potential. Take wearable health devices: These marvels of medical software and electrical engineering have been building towards this moment for years, but when we most need them for remote clinical trials, the lack of cross-device data standards renders them almost useless for research.³⁵ It’s been a decade since Marc Andreessen wrote that software was eating the world, but it’s only ever been true for some industries, and some problems. Society is an ecosystem; for software to fully integrate into the world, it must be able to work with it.


Modern Architecture as Legos

Matt Bradbeer
Director, Client Partner
London, UK

Legos are great. You can strictly adhere to the instructions or mix it up safe in the knowledge that each brick, wheel, twiddly bit, or baseboard will work seamlessly together. Interoperability is the key, across sets and across eras. With a little bit of planning you can create whole worlds that are driven by your imagination. Anyone can play with Lego–the rules are open and accessible to all with consistent standards.

Modern architecture, known as MACH (microservices, API-first, cloud-native, headless), is the Lego of the digital world. Businesses across all industries, B2C and B2B, are adopting MACH as an approach to achieve their business goals and solve the challenges they face engaging their customers or empowering their colleagues. Again, interoperability is the key here, as APIs combined with microservices are what allow the components to work together, across several vendors and in a neatly composable manner.

Consider a scenario where a business cannot quickly change its digital customer experience touchpoints without a full-scale release or having to think about the capabilities of backend systems. Many legacy ecosystems are “hard-wired”: you cannot alter one element without impacting another, so a change to the customer experience layer may require a significant release and ripple into a back-end system. This is the equivalent of building a Lego house and supergluing everything together.

A MACH approach can be used to detach (or unstick) components such as the customer experience layer. With APIs pulling data and content from the rest of your technology ecosystem, the business can change the customer experience layer without the previous overhead.
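To make the idea concrete, here is a minimal sketch of a detached experience layer. All service names, fields, and values are hypothetical, and the backend calls are stubbed out rather than real network requests: the point is that the front end only ever talks to one composing API, so backends and presentation can change independently.

```python
def fetch_product(product_id: str) -> dict:
    # Stand-in for a call to a product microservice's REST API.
    return {"id": product_id, "name": "Classic Brick Set", "price": 29.99}

def fetch_content(product_id: str) -> dict:
    # Stand-in for a call to a headless CMS.
    return {"headline": "Build anything", "hero_image": "/img/bricks.png"}

def experience_api(product_id: str) -> dict:
    """Compose backend responses into one payload any front end can render."""
    product = fetch_product(product_id)
    content = fetch_content(product_id)
    return {
        "title": content["headline"],
        "image": content["hero_image"],
        "product": {"name": product["name"], "price": product["price"]},
    }

page = experience_api("set-001")
```

Because the composing layer owns the payload shape, a redesign of the customer experience touches only `experience_api` and the front end, not the services behind it.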

Or consider a scenario where a business has an eCommerce platform that’s responsible for more than just commerce, such as content management, front-end, and order management. Maybe this platform is getting old or needs an expensive upgrade, prompting a discussion about replacement. While a “big-bang” replacement looks risky and expensive, a business can, over time, replace each element of the ecosystem with an API-driven alternative, de-risking the whole process.
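This piece-by-piece migration is often called the “strangler fig” pattern. The sketch below is hypothetical (capability names and handlers are invented): each capability routes either to the legacy monolith or to a new API-driven service, and migrating a capability is just a matter of flipping its flag.

```python
# Which capabilities have been carved out of the monolith so far.
MIGRATED = {"content": True, "orders": False, "checkout": False}

def legacy_platform(capability: str, payload: dict) -> str:
    # Stand-in for the old all-in-one eCommerce platform.
    return f"legacy handled {capability}"

def new_service(capability: str, payload: dict) -> str:
    # Stand-in for a dedicated API-driven replacement service.
    return f"api service handled {capability}"

def route(capability: str, payload: dict) -> str:
    """Send each request to whichever system currently owns the capability."""
    handler = new_service if MIGRATED.get(capability) else legacy_platform
    return handler(capability, payload)
```

Each flag flip is a small, reversible release, which is exactly what de-risks the replacement compared with a big-bang cutover.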

MACH is, by definition, cloud-native SaaS, which means no upgrades, or at least not the sort that require massive projects, take months, and cost a lot of money. Each component of a business that is powered by a MACH solution is always on the latest version. So swapping those old-style ’70s Lego bricks for the latest 21st-century ones is painless.

Finally, MACH technologies empower organizations to adopt more effective and efficient ways of working. Product-led roadmaps, removing silos, cross-functional teams, and shared KPIs are some of the more obvious changes. IT and development teams are able to focus on creating new capabilities that empower colleagues and enable better customer experiences. Instead of fighting over the best bricks and arguing over what color the roof should be, it becomes a harmonious, efficient creative process where colleagues and customers benefit—and so does the bottom line.

Arnold, Automation, and Insured Assets

Michael Nicholls
Principal, Financial Services
London, UK

Whenever I think of robots, two images come to mind, both from the ’80s: Arnold Schwarzenegger in Terminator, and Japanese car manufacturing. I have halcyon memories of those days. I recall writing my first line of code at a bank in London in 1979. Bankers still wore bowler hats and carried an umbrella and a folded copy of the FT, and insurance underwriters and brokers wheeled around big trolleys of paper files as they negotiated each policy.

Fast forward to 2020. After my decades of working in financial services, there are no bowler hats in sight, fewer bankers, and trolleys are much less evident, the paper files having been replaced by laptops. Every insurer now has a Head of Automation, a wizard who promises magical cost savings of 40% to 70%, all tantalizingly within reach!

How did they create such magic? Mainly with robotic process automation (RPA). RPA is used by every insurer, broker, and market participant on the planet to reduce costs, improve service, and increase revenues. As it says on the tin, RPA cuts costs in several ways: transcribing data from scanned documents and other electronic sources, and automating manually intensive, complex processes while ensuring consistent, highly accurate, and scalable execution. Oh, and the RPA bots work 24/7, and never take holidays or sick leave.
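The transcription step is the least glamorous but most common piece of this. Here is a minimal sketch of what such a bot does, with an entirely hypothetical document format and field names, using simple pattern matching to lift structured fields out of semi-structured claim text so nobody retypes them:

```python
import re

# Hypothetical text as it might come out of a scanned claim form's OCR.
CLAIM_TEXT = """
Policy No: POL-88231
Claimant: J. Smith
Amount Claimed: GBP 4,250.00
"""

def extract_claim_fields(text: str) -> dict:
    """Pull named fields out of semi-structured text; None if a field is absent."""
    patterns = {
        "policy_no": r"Policy No:\s*(\S+)",
        "claimant": r"Claimant:\s*(.+)",
        "amount": r"Amount Claimed:\s*(.+)",
    }
    out = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, text)
        out[name] = match.group(1).strip() if match else None
    return out

record = extract_claim_fields(CLAIM_TEXT)
```

Real RPA platforms wrap this kind of extraction with OCR, queueing, and exception handling, but the consistency win is the same: the same rules run on every document, every time.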

Cost efficiency has always been an 'easier' metric to track and report, thus the temptation to rush through these RPA cost-reduction initiatives is almost irresistible. Quick wins in back-office processes (typically finance/HR and internal IT) were easy targets–and then claims handling and other manually intensive support processes. The approach was the same: Take a process, re-engineer, and automate with RPA.

Then the data geeks took a look. A frown developed. Processes being changed meant data mapping changed. Suddenly the data landscape was fragile, RPA began 'falling over,' processes no longer worked together, and systems crashed. The wizards then cast a new spell: “API…. data… orchestration… big data… machine learning… natural language processing… artificial intelligence…”

A remarkable thing about insurance is that it’s built on nothing more than a promise–bought by people who don't want it, and never want to use it. Most of us are lucky enough to go decades never needing to make a claim and forget all about it–other than the direct debit each month. Then something awful happens, and for many, it becomes the first time they read their policy.

This is the ultimate moment of truth for the customer relationship. This is where automation can reduce costs, improve outcomes, and cement client relationships. Every claim requires matching to policy terms, an evaluation of the claim, and estimation of the amount involved. All with a customer waiting at the end of the phone, sometimes with their entire livelihoods at stake.

There are few businesses where data dominates the entire value and supply chain the way it does in insurance, yet many insurers have struggled to adopt and exploit RPA. Nevertheless, automation at every stage and every interaction can reduce costs and give the client better service–if done properly. So, next time you see Arnold, imagine him as a force for good, taking costs out of the process and helping customers and insurers interact more fairly.

Of course, it’s not just RPA changing the face of insurance. Machine learning, big data, artificial intelligence and natural language processing are also causing seismic changes, but I will have to leave that till next time. As The Terminator would say–"I'll be back!"



The number of different electronic health record (EHR) platforms in use within the average health system. 75% of hospitals have to contend with more than 10 different EHRs amongst their affiliate practices.

Source: HIMSS Analytics via HealthcareITNews



17. New York Times, “Bottleneck for U.S. Coronavirus Response: The Fax Machine,” 07.13.2020
18. South China Morning Post, “Why Hong Kong can’t disburse coronavirus funds directly to employees,” 07.20.2020
19. South China Morning Post, “Coronavirus: doctors blame Hong Kong’s outdated IT systems for slowing Covid-19 response, delaying reopening of mainland border,” 05.31.2020
20. New York Times, “Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them.,” 07.08.2020
21. The Washington Post, “Apple and Google are building a virus-tracking system. Health officials say it will be practically useless.,” 05.15.2020
22. Wired, “State-Based Contact Tracing Apps Could Be a Mess,” 05.27.2020
23. World Economic Forum, “How the new EU data strategy could affect trade and competition,” 02.25.2020
24. Chris Riley, Mozilla, “A framework for forward-looking tech competition policy,” 09.26.2019
25. Shorenstein Center Policy Papers, “Key Elements and Functions of a New Digital Regulatory Agency,” 02.12.2020
26. U.S. House Judiciary Committee, “Judiciary Antitrust Subcommittee Investigation Reveals Digital Economy Highly Concentrated, Impacted By Monopoly Power,” 10.06.2020
27. STAT News, “What to expect from ‘Modernizing the FDA’s Data Strategy’ meeting,” 06.29.2020
28. Healthcare IT News, “Frost & Sullivan predicts nearly 2x growth in interoperability market by 2024,” 07.28.2020
29. Agtech… So What? Podcast, “Ep63 Evan Fraser on 3 barriers to agtech adoption and impacts of COVID-19 on agriculture,” 03.25.2020
30. IT News, “Farmers crack whip on data rights in agtech reality check,” 01.22.2020
31. Hackernoon, “Cross-Chain Interoperability: Enabling The Future of DeFi,” 09.19.2020
32. Inside Higher Ed, “The Babel Problem with Big Data in Higher Ed,” 07.22.2018
33. The Atlantic, “What Big Tech Wants Out of the Pandemic,” 07.31.2020
34. Marc Andreessen, Andreessen Horowitz Blog, “It’s Time to Build,” 04.18.2020
35. STAT News, “With Covid-19 halting clinical trials, wearables could be key — but data ‘wild west’ gets in the way,” 08.11.2020