A Journey Through Data Privacy: From Locked File Cabinets to the Cloud
Written by Jonathan Kass, Co-Founder & VP of Operations at Pyxos
Yet another breach letter?
As 2024 ended, I received yet another breach notice in the mail. This one was from a company that processes health care claims for my insurance carrier. They shared that their systems had been compromised, and that some details about me were likely included — oh, you know, just some small details like my name, address, health conditions, and government identifiers.
And while I’ve been working in technology for my entire career — and have a technical understanding of breaches and compromises — my first thought was: “How is this still happening in 2024?! And how, in particular, does a healthcare company, one that should have been held to the highest standards for maintaining the privacy of my data, fail so completely?”
The beginning of my journey with data privacy [1986–1995]
I began my career as a hands-on software engineer in the highly security-conscious world of aerospace engineering, on a team designing operating software for the International Space Station. In those days, desktop computers were relatively new, and security and confidentiality were primarily a question of working in locked building quadrants and padlocking the really sensitive stuff, on paper, in file cabinets.
During this time, we were experimenting with this new thing called the ARPANET — by running a protocol called TCP/IP over our nascent desktop "LAN Manager" networking, we were able to open files on computers at other companies all over the country! Tools like Gopher and FTP were the leading edge of interconnected data transfer!
Most of our data at that time was on mainframe and mid-range servers which were safely walled off from broad access. But with the earliest configuration of multitasking desktop-based servers came a new kind of database, a “SQL-Server” that was going to be capable of democratizing data storage, access, and utilization in a world where the gold standard was still that old mainframe.
Into this purpose-built hardware world of specialized CAD/CAM workstations and powerful (at the time) bespoke minicomputers came what we’d eventually come to know as commodity hardware, which could host essentially limitless amounts of structured and unstructured data.
Little did we know in 1993 that with interconnectedness would come an astonishing amount of risk.
Next stop: highly private health data [1995–2005]
Flash forward to working at a health insurance company during the dawn of HIPAA — the Health Insurance Portability and Accountability Act — first signed into law in August of 1996, while Independence Day and Mission: Impossible were redefining the blockbuster. While it would be a few years until the privacy rule was fully in effect (between 2003 and 2004), the federal standard sent a message to the healthcare industry: start making it easier for consumers to port their data from carrier to carrier, and give them reason to trust their data was being kept safe and secure.
When I made the transition from aerospace to business, and joined a specialty health insurance company focused on mental health and chemical dependency, our first project was moving the company from a mainframe health care platform to a very-ahead-of-its-time Windows-based client server platform using a SQL server back end.
Security had taken its proper place at the heart of client-server computing by that point, with role-based security built into both the front-end application and the back-end database itself. Security of the storage system was largely an operating-system configuration concern, and we applied the rule of "least privilege" wherever we could, embedding security right at the foundation as we moved from mainframe to client-server.
When the migration was complete (and oh, what a project that was…), we were on a modern platform with tremendous flexibility to provide on-demand and scheduled reporting, apply upgrades and enhancements (relatively) seamlessly, and leverage our data in new and innovative ways. It was an exciting collaboration internally between business and technology leaders, led by our then Director of I.T., Kerry Matsumoto, and our COO Len Whyte, both of whom I was lucky to have as mentors and friends. It was also a terrific client-vendor collaboration, with what was then the Erisco Facets software team who were tremendously open and sharing about their architecture and roadmaps.
One of those points of leverage was the ability to integrate our specialty health company with larger medical health plans, supplementing their medical benefits with our behavioral health services. It was at this moment of strategic business expansion that the Health Insurance Portability and Accountability Act (HIPAA) was initially released.
Suddenly, securing data exchanged between health care companies — already common practice, though largely bespoke from partner to partner — was not enough: there were now regulatory standards for how that data was organized and transferred between entities, with the goal of making it easier for consumers to switch from one health plan to another, and for carriers to share data without so many proprietary solutions involved.
With Kerry and Len’s leadership, we quickly realized that there was an opportunity to create a competitive advantage in our bids for medical carrier partnership opportunities if we could demonstrate that our adherence to HIPAA would actually simplify the carrier integration process, rather than complicate it.
At the time, this meant scrambling together a ‘tiger team’ of data exchange experts, and getting them up to speed on how we could leverage the ERISCO platform to make things easy for the large medical carriers. This ‘make it easy’ mindset became part of our value proposition in proposals and a core part of our implementations. And it was my earliest experience in understanding that compliance could potentially be not simply a way to avoid fines, but something that could be integral to business development and growth.
Just as healthcare was grappling with HIPAA, another wave of compliance challenges was about to hit — this time, targeting financial institutions.
The dawn of retail privacy, PCI-DSS [2005–2015]
In 2005, I had the opportunity to join the oldest and largest pet health insurance company in the United States, Veterinary Pet Insurance (VPI), which ultimately became Nationwide Insurance Company's Pet Insurance division. I joined as CIO, part of a "turnaround" senior leadership team challenged with improving business operations and customer experience.
While it may sound a bit quirky, pet health insurance is a regulated excess-and-surplus insurance category in most states — a couple of states actually created a specific category to regulate pet health insurance directly as it grew during this period (e.g. California, which continues to evolve its regulations — see https://www.gov.ca.gov/2024/09/26/governor-newsom-signs-pet-insurance-reform-bill-takes-action-to-support-animals-and-pets/). So regulatory compliance was nothing new for the company.
VPI was a relatively early adopter of online and phone credit card transaction processing, as the insurance was mainly sold directly to consumers. Applications had been built and customized to handle those transactions, along with somewhat complex interfaces to credit card processors to make everything work.
However, as 2004 came to a close, the credit card industry released its first compliance requirements — the Payment Card Industry Data Security Standards (PCI DSS) — and our little pet insurance company suddenly had a host of new requirements to adapt to.
The first release of the PCI DSS was pretty strict for its day — it required protection of all of the Personally Identifiable Information (PII) that could be used to fraudulently charge a credit card: things like name, address, bank accounts, credit card numbers, expiration dates, and of course that 3- or 4-digit code on the back of the card that everyone is always asking you for! The standard had requirements for encryption and for creating a need-to-know environment, where the only people with access to PII were those who could demonstrate a business need for it to get their jobs done.
Businesses taking credit cards had to quickly put together a plan for compliance — among other penalties if they didn’t meet the standard, the bank card processors would no longer work with them! So it was an immediate and sustained priority to get and stay PCI-DSS compliant. Initially it was a heavy lift until the processors themselves provided tools that significantly reduced the exposure a business had taking payments from its customers.
But regardless of the approach — even after tokenization of credit card info became the gold standard (and every vendor had to sheepishly ask their customers for their card info again due to a “system upgrade” to establish the tokenization approach) — everyone was still subject to annual audits and attestations to prove compliance.
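The idea behind tokenization can be sketched in a few lines: the payment processor keeps the real card number in a vault and hands the merchant an opaque token, so the merchant's systems never store the sensitive value at all. This is a minimal, purely illustrative sketch — the `TokenVault` class and its method names are hypothetical, not any real processor's API:

```python
import secrets

class TokenVault:
    """Illustrative card-tokenization vault (hypothetical, not a real PCI service).

    The vault (operated by the payment processor) holds the real primary
    account number (PAN); the merchant stores only the opaque token,
    which shrinks the merchant's PCI-DSS audit scope."""

    def __init__(self):
        self._store = {}  # token -> PAN, kept only inside the processor

    def tokenize(self, pan: str) -> str:
        # Issue a random, non-reversible token in place of the card number
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the processor can map a token back to the real card number
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# The merchant keeps `token`; the PAN never touches its database again
```

The token is random rather than derived from the card number, so a breach of the merchant's systems yields nothing chargeable — which is exactly why tokenization so dramatically reduced the exposure (and audit burden) described above.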
Which led to a realization: we could either begrudgingly address these annual audits as they arose — mildly shocked every year at what was required of us, derailing projects to assign people to collect and collate the evidence — or we could try a different approach: figure out how to conduct the business processes themselves so the audits were easy. Again, making compliance "easy" was the mantra, and soon we were sailing through the annual audits without any disruption to the day-to-day business, and without having to staff up a team to handle the work. That's not to say it didn't take some up-front assessment of our then-current processes, and an investment in doing things a little differently to get different results. In the end we saved significant time annually and passed our audits consistently. It became an actual competitive edge.
And the (regulatory) beat goes on… [2016–2025]
As my career took me into the realm of B2C marketing and direct response advertising, yet another regulatory framework was making waves — this time starting in Europe. The General Data Protection Regulation (GDPR), which came into effect in May of 2018, was a direct response to the dangerous, breach-filled world we'd inadvertently started toward back in the early '90s. Personal data was no longer confined behind health care or banking firewalls — after 20 years of the world wide web, the explosion in e-commerce and online marketing meant personal data was spread to every corner of the internet. Web servers were caching email and address info; social media sites became farms for harvesting social graphs that illustrated all of your likes, dislikes, and buying behaviors. Even telco carriers became data brokers, monetizing metadata to enhance an increasingly intrusive data profile that consumers were barely aware of, let alone asked to consent to.
Into this chaos came GDPR, with the intent of bringing to light all the reasons and places your personal data was being used. It would solve all of this by forcing businesses to make it clear where, when, and why they were using your data. And you’d have the opportunity to control that usage.
Of course we know what came next — thousands of "check here to accept our use of cookies" banners launched on every website online to create evidence that consumers were opting in to all manner of uses of their private data, the vast majority of which never get read or reviewed. You could call this "compliance by overwhelming the consumer": everything from cookie choices and privacy policies to default "opt-in" or "opt-out" language that most people never read.
But behind all those opt-in requirements, companies had a host of new responsibilities. As privacy policies became geographic and fragmented, companies not only had to track what a consumer agreed to, they had to track and demonstrate that they knew who was seeing their content, and where (geographically) that person was when they saw it, in order to apply the appropriate controls and consent requirements. Soon thereafter came the California Consumer Privacy Act (CCPA), with its own rules and regs — a harbinger of a future where the rules about what you can collect, store, and use about a consumer could be regulated at the state level in the U.S.
And yet, despite all of these regulations, and over 20 years of implementation if you just start with HIPAA, companies today still can’t seem to protect consumer information.
Which brings me full circle to that breach notice I received at the end of 2024 — again, it was a healthcare-related company, one of the most regulated entities around, and they simply "had a breach", which they of course notified impacted consumers of, since that's also required by regulation. No apology for not protecting the data more rigorously; no promise to do better in the future. Just a notice and a coupon for one year of identity monitoring, so I can find out how many places my data starts popping up. Gee, thanks.
For me, what this signals is that we're still in a world where many companies treat compliance as a cost and a burden — something they remain surprised they have to conform with, and on which they spend the minimum necessary to pass an audit, without making the investments that would turn compliance into a competitive advantage.
There has to be a better way.
Never one to raise problems without offering solutions, I do have some thoughts on what that "better way" might entail. In the coming months, I'll explore how companies can turn compliance from a burden into a business advantage — and why failing to do so is no longer an option.
Jonathan Kass is Co-Founder and VP of Operations for Pyxos, Inc., a generative-AI led compliance solutions company focusing on MENA privacy challenges. He has spent his career in technology and operations leadership roles in regulated industries including aerospace, healthcare, insurance, and marketing.