Regulatory Compliance Demystified: An Introduction to Compliance for Developers

 

Security Innovation, Inc.

March 2006

Summary: For a developer, understanding the issues around regulatory compliance can be a difficult and frustrating endeavor. This article makes sense of regulatory compliance from a developer's point of view. It examines Sarbanes-Oxley, HIPAA, and other regulations, and covers the most important best practices that are common across multiple pieces of legislation.

Contents

Introduction
Sarbanes-Oxley, Section 404
Health Insurance Portability and Accountability Act
Payment Card Industry Data Security Standard
Gramm-Leach-Bliley Act
SB 1386
BASEL II
Other Regulatory Constructs
Technology and Technique Best Practices
Conclusion

Introduction

Understanding the issues around regulatory compliance can be a difficult and frustrating endeavor for developers. Most developers do not have a legal background, and regulators generally do not have a background in software development. The result is a failure to communicate: the language and requirements described in legislation are not easy to map to explicit software requirements. The problem is compounded by the growing diversity of regulations at a variety of levels — state, federal, and international — that now make up a patchwork of compliance requirements with sometimes overlapping applications. This document attempts to bridge this gap and make sense of regulatory compliance from a developer's point of view. We've spent the time reading and analyzing legislation so that you don't have to. While this document may not contain every detail you need, it should provide a good starting point to help you focus on the right areas to be successful in your compliance objectives.

This document covers six of the most relevant pieces of legislation in depth and then touches on four others more lightly. For each piece of legislation, four areas are covered.

  • Summary of the Legislation: Explains the act from a developer's point of view, telling you what you need to know in order to understand its implications on your application development.
  • Required Process Steps: Explains in more depth which requirements are relevant to software developers. Generally speaking, this section describes what types of data are considered sensitive and how they need to be protected.
  • Technologies and Techniques: Explains strategies and techniques for meeting the legislative requirements. These are separated into five main categories: Confidentiality, Integrity, Availability, Auditing and Logging, and Authentication.
  • Additional Resources: Provides links where you can gather more information on the legislation in question.

Following the legislative sections is an important section titled "Technology and Technique Best Practices." In this section we've attempted to collect the most important best practices that are common across multiple pieces of legislation. For instance, you'll see confidentiality issues described in the HIPAA section; you can then go directly to the best practices section to learn more about how to keep sensitive data confidential.

You may not know which, if any, of the acts apply to you. The following list provides a brief description of each act and where it applies:

  • Sarbanes-Oxley: Privacy and integrity of financial data in publicly traded corporations.
  • HIPAA: Confidentiality, integrity, and availability of health care information.
  • PCI: Confidentiality of credit card information stored and used by merchants.
  • GLBA: Confidentiality and integrity of personal financial information stored by financial institutions.
  • SB 1386: Confidentiality of customers' personal information stored by any organization that does business in the state of California.
  • BASEL II: Confidentiality and integrity of personal financial information stored by financial institutions; availability of financial systems; integrity of financial information as it is transmitted; authentication and integrity of financial transactions.

If your application fits within any of these categories, it is likely that compliance is already a major issue for you. If it's not, it will be soon. Compliance is an increasingly critical issue for the consumers of software, and these requirements are being driven into software development. Quite simply, it is an issue that, if ignored, threatens your business. The good news is that most of the regulations apply to IT managers and corporate business processes. As a developer, you need only worry about a subset of the requirements that enable your users to meet their compliance goals. The following sections describe in detail what this means.

Sarbanes-Oxley, Section 404

Summary of the Legislation

In the wake of corporate financial scandals like the Enron disaster of 2001, Congress passed what President George W. Bush characterized as "the most far-reaching reforms of American business practices since the time of Franklin Delano Roosevelt." The act, referred to as Sarbanes-Oxley (commonly abbreviated as SOX), was signed into law in 2002. Its purpose is to give investors more confidence in the financial reporting process of publicly traded companies by putting controls in place to ensure the confidentiality and integrity of financial data. The act applies to companies that are publicly traded in the United States, but it has far-reaching international applicability because many large foreign companies are also traded on U.S. stock exchanges. The key part of SOX for developers is Section 404, titled "Management assessment of internal controls." This section requires management to take responsibility for the integrity of financial information by evaluating IT systems and processes and producing evidence that the company has done a reasonable job keeping sensitive information safe. While SOX doesn't address IT directly, the implications for IT are huge, given that most financial data in the organization flows through computerized information systems and the code that you write.

SOX has been a major driver for IT security. Among its impacts has been visibility into IT security at the highest levels of an organization. Section 302 of the Sarbanes-Oxley Act requires the CEO and CFO to issue periodic statements certifying that adequate controls are in place for the handling of financial information in the organization. Ignorance of a vulnerable system is no longer a defense, because top executives now have to attest that proper controls are in place. The act also imposes stiff penalties for misrepresenting the state of controls, and it holds not just the organization but the CEO and CFO personally accountable. To be SOX compliant, companies must have regular external audits that assess the controls in place to ensure that data is accurate, unaltered, and a true representation of the company's financial position. SOX has driven significant spending on IT and IT security.

Required Process Steps

Corporations have been struggling with SOX compliance since its inception because there are no hard and fast rules or checklists companies can use to verify that they are actually "in compliance." Compliance with SOX has historically come down to the opinions of external auditors attesting that the proper controls are in place to ensure that financial data moves through the organization unaltered and is exposed only to the correct people in the organization. Part of the auditing process is to trace the flow of financial information inside the organization. For most companies, most of this information flow takes place through IT systems. This means that IT needs to provide assurance that this data:

  • Cannot be altered by unauthorized individuals.
  • Cannot be viewed by unauthorized individuals.
  • Is available when needed by authorized individuals.

IT must also ensure that any material changes to the infrastructure that touch this data are documented and reported to management immediately.

Compliance comes down to implementing access controls on data that actually work, and to ensuring that vulnerabilities in the system are patched and do not allow unauthorized modification or leakage. One commonly used framework to help IT comply with the needs of SOX is COBIT (Control Objectives for Information and related Technology), an open standard published by the IT Governance Institute and the Information Systems Audit and Control Association.

Technologies and Techniques

Fundamentally, Sarbanes-Oxley compliance comes down to an auditor's assessment of an organization's ability to restrict who has access to resources that manage financial data and to track what has changed in the IT environment. In practice, developers need to address these needs in the applications that they write. For more information on each of these areas, see Summary of Best Practices below.

  • Confidentiality: It is a SOX requirement that confidential information cannot be exposed to unauthorized entities. Support the use of accepted encryption technologies and algorithms to ensure that data is divulged only to authorized individuals. Properly implemented encryption also reduces the burden on intermediate systems: a system that never needs to decrypt a given piece of data need not have access to the key material at all.
  • Integrity: Software needs to support evidence that data has not been modified, using techniques such as cryptographic hashes and robust integrity checks; a short sketch follows this list.
  • Availability: One of the requirements of SOX is the availability of financial data to authorized individuals. This includes general code reliability, resistance to denial of service attacks, use of reliable data storage mechanisms (including the option of offsite storage and recovery), failover mechanisms (such as clustering), and mechanisms for cryptographic key storage and recovery in the case of system damage.
  • Access Controls: Developers need to support role-based access and revocation of accounts. The application may have to support this by integrating into larger identity management frameworks such as LDAP. Review the access rights that your application gives to any sensitive data. Ensure that when your application is installed, database roles, file access, and all other points of data access are locked down to only those roles that have privileges to access them.
  • Auditing and Logging: One critical feature of IT controls is the auditing and logging of events in systems that process sensitive data. It is important to make sure that relevant system events are logged, such as shutdowns, restarts, or unusual events. This means that developers need to support providing those logs from their applications to administrators in a secure way. The second aspect to consider is that logs must not reveal any of the information that the system is trying to protect, thus potentially exposing it to unauthorized individuals. Carefully avoid logging any sensitive data — there is no guarantee that access to the logs and access to the sensitive data will require the same privileges.
  • Change Management: Change management is a critical part of Sarbanes-Oxley because the act specifically requires companies to notify the SEC of any material changes to the process that governs the flow of financial data. This is only applicable to you if you are writing a system that governs the flow of financial data and that can be configured in ways to change that flow. If this is the case, you need to support this requirement by offering the ability to log system changes in a way that those logs are resistant to tampering and accessible only to privileged users. A common attack that malicious users may employ to tamper with logs is to overwhelm the logging mechanism with millions of log events that either cause important information to be lost or obscure the important data. Guard against this by throttling your logging. You may also consider logging to a separate protected system to guard against attacks by unauthorized users.
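
As an illustration of the integrity point above, the following C# sketch attaches an HMAC-SHA1 tag to a serialized financial record so that unauthorized modification can be detected later. This is a minimal sketch, not a prescribed SOX control: the class and method names are hypothetical, and in practice the key would come from protected storage rather than being passed around directly.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    // Minimal sketch: tag a serialized financial record with HMAC-SHA1 so
    // that later readers can detect unauthorized modification.
    static class RecordIntegrity
    {
        // The key must be kept out of reach of anyone who could also
        // rewrite records; DPAPI or dedicated hardware are typical homes for it.
        public static byte[] ComputeTag(byte[] key, string serializedRecord)
        {
            using (HMACSHA1 hmac = new HMACSHA1(key))
            {
                return hmac.ComputeHash(Encoding.UTF8.GetBytes(serializedRecord));
            }
        }

        public static bool Verify(byte[] key, string serializedRecord, byte[] expectedTag)
        {
            byte[] actual = ComputeTag(key, serializedRecord);
            if (actual.Length != expectedTag.Length) return false;
            int diff = 0; // compare in constant time to avoid leaking match length
            for (int i = 0; i < actual.Length; i++) diff |= actual[i] ^ expectedTag[i];
            return diff == 0;
        }
    }

A record whose stored tag no longer verifies has been altered since it was tagged, which is exactly the kind of evidence an auditor will ask for.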

Additional Resources

Health Insurance Portability and Accountability Act

Summary of the Legislation

The Health Insurance Portability and Accountability Act (HIPAA) was passed in 1996 by the U.S. Congress. It established federal regulations that force doctors, hospitals, and other health care providers to meet some baseline standards when handling electronic protected health information (ePHI), such as medical records and medical patient accounts.

Note   These standards are not meant to be a final set of objectives for health care providers. Rather, they are a starting point to ensure that all entities responsible for handling patient PHI follow a standard, minimum set of rules that establishes a baseline level of protection. They are also meant as a starting point for achieving more ambitious goals in the national health care system. The complete list of Health IT initiatives is available at https://www.hhs.gov/healthit/federalprojectlist.html#intiativestable

Before HIPAA was enacted, personal information about individuals that accumulated in various private databases was thought to be the property of the organization that owned the database. The major underlying concept of HIPAA is the notion that the owners of databases are not necessarily the owners of the data contained therein — they are only intermediaries. This is a fundamental paradigm shift, because HIPAA compliant organizations must ensure that record owners are guaranteed ("Health Insurance Portability and Accountability Act," https://en.wikipedia.org/wiki/HIPAA):

  • Access to their own records and the right to request corrections of errors.
  • Prior knowledge pertaining to how their information will be used.
  • Explicit consent from the involved individuals before ePHI can be used for marketing.
  • The right to ask and expect health organizations to take reasonable steps to ensure that communications between the individual and organization are kept private.
  • The right to file formal privacy related complaints to the Department of Health and Human Services (HHS) Office for Civil Rights.

Because the regulations need to be applicable to health care providers and services of all sizes, they are purposely vague. The security provisions section of HIPAA consists of three different sets of requirements, each of which lists specific safeguards:

  • Administrative Safeguards contain rules to establish and enforce company privacy policies and procedures (for example, disaster recovery and contingency plans).
  • Physical Safeguards encompass restrictions and rules that deal with physical access to facilities and machines, access controls, as well as the associated policies and procedures that deal with the physical entities in the organization.
  • Technical Standards contain all of the safeguards and practices that relate to the intangible information contained in the organization's computer systems, such as intrusion prevention, data corroboration, and access controls. This paper focuses on the technical standards because they contain most of the actionable items for developers.

Required Process Steps

There are several provisions of HIPAA, divided into two title sections. Title I deals with portability of health insurance coverage for employees and families when changing jobs, and was also created to ensure that employees with pre-existing medical conditions cannot be unfairly denied medical coverage. Title II contains the "Administrative Simplification" provisions and has three subcategories: privacy provisions, HIPAA Electronic Data Interchange (HIPAA/EDI), and security provisions ("Health Insurance Portability and Accountability Act," https://en.wikipedia.org/wiki/HIPAA). The privacy provisions and HIPAA/EDI are not discussed here, since they primarily involve internal company policies and operating procedures (U.S. Department of Health and Human Services, "Administrative Simplification in the Health Care Industry," https://aspe.hhs.gov/admnsimp/) that are beyond the scope of this document. The security provisions are the primary concern of developers, since they contain specific technical compliance objectives.

The security provisions include:

  • Ensure confidentiality, integrity, and availability of all ePHI that the health care entity creates, receives, transmits, or maintains.
  • Prevent any disclosure of ePHI that is not permitted or required.
  • Ensure that system information is available for audit trails.
  • Ensure that authentication is in place so that specific workers or entities are who they say they are.

When analyzing the process steps, it's important to clarify the two types of specifications declared in the technical standards section of HIPAA:

  • Required Specifications are items that must be met uniformly across the entire standard and across all organizations. For example, a sufficient level of encryption strength to ensure patient record privacy should be acceptable for any organization that strives for HIPAA compliance.
  • Addressable Specifications are subject to evaluation in terms of appropriateness based on circumstance and pragmatic considerations. For example, to meet the auditing requirements of HIPAA, organizations need to implement some kind of backup and storage system, but a small doctor's office does not need a solution on the same scale as Massachusetts General Hospital. Addressable specifications give the organization a certain level of control over determining what is appropriate for compliance. Addressable compliance items are not to be confused with optional ones; they are still mandatory.

Technologies and Techniques

There are several specific technologies that are readily available and can practically meet the needs of the aforementioned process steps. For more information on each of these areas see the Summary of Best Practices below.

  • Confidentiality: All ePHI must be kept confidential to prevent unauthorized parties from accessing a record owner's account. Use appropriately strong encryption when storing confidential data in databases or files, when transmitting it over a network, and when the data is in memory. Design software so that the encryption algorithms are replaceable and key sizes can be easily increased, so that encryption strength can keep pace with advances in computing power and cracking algorithms; a sketch of this approach follows this list.
  • Integrity: Records should not be modifiable by unauthorized people or entities. Developers need to apply the principle of least privilege and perform meticulous error handling to minimize the risk of privilege escalation. All sensitive information should use an integrity checking mechanism such as HMAC-SHA1 or a digital signature to limit the risk of information tampering.
  • Availability: Since record owners are guaranteed the right to access their own records, lack of service availability could result in a HIPAA compliance violation. Developers need to design systems to properly handle errors and withstand denial of service attacks. Event logs should contain enough information to make it possible to reconstruct system activity up to the point of failure so that the error can be quickly resolved and fixed.
  • Auditing and Logging: Any actions that might need to be traced must be documented. Software systems must generate all of the necessary logging information to construct a clear audit trail that shows how a user or entity attempts to access and utilize resources. Make sure that these logs are backed up regularly to ensure that auditing information is not lost due to system failure.
  • Authentication: To securely operate on ePHI, it is necessary to know that the entity or person working with the data is legitimate and authorized to access said information. Permissions are a set of mappings that correlate the identity of a person or entity to a specific set of permissible operations. Authentication systems should be designed with clear roles mapped to permissions, so it is easier for developers to implement them and for testers to expose abuse cases. When setting up authentication systems, be sure that the system enforces strong passwords out of the box.
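
As a rough illustration of the replaceable-encryption advice above, the following C# sketch resolves the cipher by name at run time through the .NET SymmetricAlgorithm factory, so that the algorithm and key size can be changed through configuration rather than code. The "Rijndael" example and the class name are illustrative assumptions, not values mandated by HIPAA.

    using System.IO;
    using System.Security.Cryptography;

    // Minimal sketch of crypto agility: the algorithm name (for example
    // "Rijndael" or "TripleDES") comes from configuration, so the cipher
    // can be upgraded without recompiling the application.
    static class AgileEncryptor
    {
        public static byte[] Encrypt(byte[] plaintext, byte[] key, byte[] iv,
                                     string algorithmName)
        {
            using (SymmetricAlgorithm alg = SymmetricAlgorithm.Create(algorithmName))
            {
                alg.Key = key; // the key length implicitly selects the key size
                alg.IV = iv;
                using (MemoryStream ms = new MemoryStream())
                using (CryptoStream cs = new CryptoStream(ms, alg.CreateEncryptor(),
                                                          CryptoStreamMode.Write))
                {
                    cs.Write(plaintext, 0, plaintext.Length);
                    cs.FlushFinalBlock();
                    return ms.ToArray();
                }
            }
        }
    }

Moving to a stronger cipher or a longer key then amounts to changing the configured name and supplying a matching key, with no change to the calling code.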

Additional Resources

Payment Card Industry Data Security Standard

Summary of the Legislation

The Payment Card Industry (PCI) Data Security Standard is based on security programs that were independently created by four separate card services: the Visa Account Information Security (AIS) program and its affiliated Cardholder Information Security Program (CISP), the MasterCard Site Data Protection (SDP) program, the American Express Data Security Operating Policy (DSOP), and Discover Information Security and Compliance (DISC).

PCI establishes a comprehensive set of worldwide security standards for all merchants and service providers that deal with the storage, transmission, or processing of cardholder data from any major card service. Compliance with the Payment Card Industry (PCI) Data Security Standard applies to merchants and service providers in all payment channels, including retail stores with physical presence, mail and telephone orders, and e-commerce. Its scope covers systems, policies, and procedures (Interactive Media in Retail Group, https://www.imrg.org/IMRG/Library.nsf/0/ECB04B83B24D62D180256FEA00439EAF?OpenDocument).

Required Process Steps

To certify official compliance with PCI, an audit must be completed to verify that the standards are adequately being met. Quarterly and annual compliance audits follow thereafter, with the exact requirements dependent on the merchant's classification — Level 1, 2, 3, or 4. All merchants are partitioned into levels based on the volume of credit card transactions they average on a yearly basis (Interactive Media in Retail Group, details on selection criteria, https://www.imrg.org/pci_compliance_uk.pdf). The more transactions an organization processes, the more critical it is that the organization meets the PCI standards to manage the greater risk. This partitioning is an attempt to reconcile the difficulties associated with balancing security against practical overhead. There are several broadly defined goals that software developers in merchant organizations need to follow to achieve PCI compliance (Interactive Media in Retail Group, specific details on PCI data security standard process steps, https://www.imrg.org/pci_compliance_uk.pdf):

  • Protect cardholder data.
  • Implement strong access control measures.
  • Regularly monitor and test networks.

Unfortunately, a merchant's ability to track privacy violations and enforce corrective measures is drastically hindered, both by legislation and by technical realities: no single merchant possesses the complete forensic evidence needed to undertake such actions. A defensive stance is really the only option for a merchant that must develop code that operates on credit card data.

Technologies and Techniques

The following strategies are applicable to PCI compliance. For more information on each of these areas see Summary of Best Practices below.

  • Confidentiality and Authentication: This is the central goal of PCI — to protect consumer credit card information. Unfortunately, developers do not have complete, end-to-end control over the credit card handling process. In an ideal world, credit card data would be exchanged directly between the credit card company and the individual, but this is rarely the case. Credit card data often passes through several intermediary agents, some of which consumers cannot see, but all of which they are forced to implicitly trust. The best solution that can be implemented from a single organization's perspective is to ensure that the data is properly encrypted and that only authorized systems or agents have access to sensitive account information. It is not an effective end-to-end solution, but it will at least help shield the company from the costly penalties that can be incurred in the event of privacy violations and disclosures.
  • Logging and Auditing: This is the second most critical aspect of PCI compliance. Developers need to ensure that their code provides hooks for logging all pertinent account transactions and accesses. Unfortunately, it is very difficult for merchants to proactively detect instances of credit card fraud, because they do not possess a centralized bank of behaviors indicating which usage patterns are suspect for a given account. Behavior monitoring is a task left to the credit card issuing and tracking agencies. From a developer's perspective, it is important to ensure that this information is logged, so that internally developed applications will not cause accountability gaps when federal or state agents request backlogged data; a sketch of such a logging hook follows this list.
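
The following C# sketch shows one shape such a logging hook might take, under the assumption (not part of the PCI text itself) that log entries must never contain the full account number: all but the last four digits are masked before the entry is formatted. The names and entry format are illustrative.

    using System;

    // Minimal sketch: build an audit entry for a card transaction without
    // writing the full primary account number (PAN) to the log.
    static class CardAudit
    {
        public static string MaskPan(string pan)
        {
            if (pan == null || pan.Length < 4)
                throw new ArgumentException("invalid account number");
            // keep only the last four digits, e.g. ************1234
            return new string('*', pan.Length - 4) + pan.Substring(pan.Length - 4);
        }

        public static string FormatAuditEntry(string pan, decimal amount, string operatorId)
        {
            return string.Format("{0:u} op={1} pan={2} amount={3}",
                                 DateTime.UtcNow, operatorId, MaskPan(pan), amount);
        }
    }

Each transaction is thus traceable to an account and an operator without the log itself becoming another store of cardholder data.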

Additional Resources

Gramm-Leach-Bliley Act

Summary of the Legislation

The Gramm-Leach-Bliley Act (GLBA) was passed by the Senate in November 1999 to facilitate industry-wide financial services reform. It was proposed by Senator Phil Gramm of Texas to improve competitive practices in the financial services industry by providing a framework for the affiliation of banks, securities firms, and other financial service providers (Library of Congress, "Thomas," https://thomas.loc.gov/cgi-bin/bdquery/z?d106:SN00900).

The GLBA sought to "modernize" financial services — that is, end regulations that prevented the merger of banks, stock brokerage companies, and insurance companies. The removal of these regulations raised significant risks that these new financial institutions would have access to an incredible amount of personal information, with no restrictions upon its use (Electronic Privacy Information Center, Chris Jay Hoofnagle and Emily Honig, "Victoria's Secret and Financial Privacy," https://www.epic.org/privacy/glba/victoriassecret.html). While the act itself does not specify explicit requirements for the privacy and protection of customer information, there are three provisions that constitute the privacy requirements of the GLBA: the Financial Privacy Rule, the Safeguards Rule, and the pretexting provisions.

The GLBA gives authority to eight federal agencies and the states to administer and enforce the Financial Privacy Rule and the Safeguards Rule. The financial privacy and the safeguards rules apply to financial institutions, which include banks, securities firms, insurance companies, and companies providing other types of financial products or services to consumers. Among these services are lending, brokerage or servicing of any type of consumer loan, transferring or safeguarding money, preparing individual tax returns, providing financial advice or credit counseling, providing residential real estate settlement services, or collecting consumer debts. The GLBA specifically names company CEOs and directors as personally accountable for any misuse of personally identifiable information. Failure to comply with the GLBA incurs regulatory fines for the financial institution in violation.

Required Process Steps

The GLBA allows closer ties among banks, securities firms and insurance companies, with the restriction that financial institutions and their partners are required to protect non-public personal data and to implement a variety of access and security controls. The main concern of the GLBA is ensuring the integrity and confidentiality of customer records and information. A customer's personal financial information consists of name, address, social security number, account number, and any other information a customer provides on an account application. During requirements gathering, implementation, testing, and deployment, it is important to keep these integrity and confidentiality objectives in mind. By addressing the issue during every phase of the development lifecycle it is much less likely that a problem will be missed.

Technologies and Techniques

The following strategies are applicable to GLBA compliance. For more information on each of these areas, see Summary of Best Practices below.

  • Confidentiality: All customer information must be kept confidential to prevent unauthorized parties from accessing a customer's account. Developers must use strong encryption and hashing, but must also ensure that the routines used to handle encryption, decryption, and signing are industry approved — do not ever use custom cryptographic routines. Ensure that encryption routines are modular so that they can be replaced with minimal expense. Do not rely on untrusted encryption libraries. They may lack cryptographic rigor, or contain vulnerabilities that compromise the ciphers.
  • Integrity: Records should not be modifiable by unauthorized people or entities. Developers need to apply the principle of least privilege and perform meticulous error handling to minimize the risk of privilege escalation. In software, the rarely executed error-handling routines are where the unexpected behaviors that lead to privilege escalation or unexpected failure tend to reside. All sensitive information should use an integrity checking mechanism such as HMAC-SHA1 or a digital signature to limit the risk of information tampering.
  • Auditing and Logging: Any actions that might need to be traced must be documented. Software systems must generate all of the necessary logging information to construct a clear audit trail that shows how a user or entity attempts to access and utilize resources. Make sure that these logs are backed up regularly to ensure that auditing information is not lost due to system failure. Do not place confidential information in your logs; it is possible that the unauthorized release of customer information may come from individuals with legitimate access to such records.
  • Behavioral Analysis: It is important to analyze the access patterns of users, because illegitimate data use will usually be uncovered by looking at usage patterns that would have gone unnoticed if the data access had all been done through legitimate channels. For example, consider access times. If a bank keeps normal business hours and a series of transactions occurs on specific accounts at odd hours, this could be indicative of a breach. Given that each institution has its own set of business rules, developers and operations managers will be able to expand these suspect behaviors into a more specific idea of what to look for. From a developer's perspective, thinking about the types of information that can be trapped to facilitate this behavior-based approach will save time and effort in the long run; a minimal sketch of such a check follows this list.
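
As a minimal sketch of the access-time heuristic just described, the following C# method flags accesses that fall on weekends or outside assumed business hours of 08:00 to 18:00. The hours, like the rest of the rule, are illustrative; real rules would come from each institution's own business calendar.

    using System;

    // Minimal sketch: flag account accesses outside normal business hours
    // for later review. One signal among many that a real monitor would combine.
    static class AccessPatternMonitor
    {
        public static bool IsSuspicious(DateTime accessTime)
        {
            if (accessTime.DayOfWeek == DayOfWeek.Saturday ||
                accessTime.DayOfWeek == DayOfWeek.Sunday)
                return true;
            return accessTime.Hour < 8 || accessTime.Hour >= 18;
        }
    }

Suspicious accesses would be queued for review rather than blocked outright, since odd-hours activity is unusual but not automatically illegitimate.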

Additional Resources

SB 1386

Summary of the Legislation

SB 1386 is a California State bill that amended existing privacy laws to include stipulations requiring disclosure of privacy violations by any organization or individual that maintains personal information about customers and does business in the state of California. Personal customer information is officially defined in the bill as containing a customer's last name and first name or first initial, along with at least one of the following pieces of information (LegalArchiver.org, "California SB 1386," https://www.legalarchiver.org/sb1386.htm):

  • Social security number.
  • Driver's license number or California Identification Card number.
  • Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual's financial account.

At least one of the above fields must be available in unencrypted form to constitute a privacy violation (for example, if any or all of the customer's records are leaked, but all fields are encrypted, it is not considered a breach). If all of the personal information leaked is publicly available through other sources, it is not considered a privacy violation. The enforceable provisions of the SB 1386 legislation went into effect on July 1, 2003.

SB 1386 is an example of how states are taking privacy considerations into their own hands, and taking steps to protect the private information of their residents.

Required Process Steps

SB 1386 is a very straightforward, tightly constrained set of rules focused on ensuring that the confidentiality of customer information is preserved. The narrow scope and lack of ambiguity in the legal wording has given regulators significant power in their ability to enforce SB 1386 in a variety of cases. There have already been many documented cases where companies have been found to be in violation. (See the chronology of data breaches in the reference section that follows for specific instances of enforcement.)

For all practical purposes, this bill has two specific compliance steps:

  1. Ensure privacy of customer data at all costs.
  2. Disclose all cases where personal information that meets the previously mentioned criteria is reasonably suspected of having been improperly disclosed to or acquired by an unauthorized person or entity. Disclosure must be provided directly to the affected individuals and must be carried out in a timely manner. The only caveat provided to the timely disclosure of privacy breaches is that it must first be determined that notification of a given data breach does not impact any active or pending criminal investigations.

Technologies and Techniques

The following strategies are applicable to SB 1386 compliance. For more information on each of these areas see the "Summary of Best Practices" section below.

  • Confidentiality: Ensuring the privacy of customer data is one of the primary goals behind SB 1386. Careful examination of the legislation in section II, paragraph (e) reveals a specific provision: "For purposes of this section, 'personal information' means an individual's first name or first initial and last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted." In other words, even if an attacker obtains the data, it does not necessarily constitute a reportable breach as long as the name and the data elements are all encrypted. When storing any customer data, encryption must be employed to ensure compliance; a short sketch follows this list.
  • Auditing and Logging: One of the most technically demanding aspects of complying with the SB 1386 full disclosure provision is detecting when privacy breaches occur. Dedicated attackers looking to steal information will often do no visible damage to the systems they compromise. The lack of noticeable damage means that the more obvious signs of intrusion that would typically alert investigators to the presence of wrongdoing are not there. The only way to deal effectively with such intrusions is to ensure that every transaction is logged, and to employ irregular but consistent auditing, either through manual review or through a behavior-based analysis engine. Developers should be mindful of this and provision the necessary hooks in their code so that all relevant transactions are tracked and accounted for.
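
To illustrate the encryption provision discussed under Confidentiality above, here is a minimal C# sketch that protects the individual SB 1386 data elements (SSN, driver's license number, account number) before they are persisted, using the Windows Data Protection API. DPAPI is only one key-management option among many, the class name is hypothetical, and a reference to System.Security.dll is required.

    using System.Security.Cryptography;
    using System.Text;

    // Minimal sketch: never let an SB 1386 data element reach storage in
    // clear text. DPAPI manages the key material on the caller's behalf.
    static class PersonalInfoStore
    {
        public static byte[] ProtectField(string clearValue)
        {
            byte[] clearBytes = Encoding.UTF8.GetBytes(clearValue);
            return ProtectedData.Protect(clearBytes, null,
                                         DataProtectionScope.CurrentUser);
        }

        public static string UnprotectField(byte[] cipherBytes)
        {
            byte[] clearBytes = ProtectedData.Unprotect(cipherBytes, null,
                                                        DataProtectionScope.CurrentUser);
            return Encoding.UTF8.GetString(clearBytes);
        }
    }

If a backup or disk containing only such protected fields is lost, the loss may not constitute a reportable breach under the wording quoted above, since the data elements remain encrypted.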

Additional Resources

BASEL II

Summary of the Legislation

BASEL II is officially known as the International Convergence of Capital Measurement and Capital Standards. It is a framework established by the Basel Committee on Banking Supervision, a consortium of central banks from several countries. The purpose of BASEL II is to revise the existing international standards used to measure the viability of a bank's capital. The previous BASEL accord is considered to be out of date because several realities of modern banking are not adequately reflected in its regulations. For example, the original BASEL accord does not take into account arbitrage across markets.

Note   Essentially, arbitrage means that inequalities and imbalances in different markets must be considered when evaluating the value of internationally active financial institutions. This is especially true for international banks, because their estimated value is largely dependent on equity that is backed through international markets.

Required Process Steps

Most of BASEL II is worded for banking professionals. Because BASEL II is an international standard, it is written so that it may be applicable to a variety of banking systems worldwide. These realities explain why the requirements are quite vague from an IT perspective, at least in terms of actionable compliance targets. However, there is a substantial collection of information available that describes risks and mitigation steps for electronic banking and financial services. What follows is a compilation from a few of the sources deemed most relevant to BASEL II from a developer's perspective. For a complete list of referenced sources, see the Additional Resources section that follows.

While this is not a comprehensive list, the following process steps will help ensure BASEL II compliance:

  • Prevent improper disclosure of information.
  • Prevent unauthorized transactions from being entered into the computer system.
  • Prevent unauthorized changes to software during routine development and maintenance that allow fraudulent transactions to be generated, leave certain kinds of operations unchecked, or disable logging with the purpose of bypassing auditing so that actions may proceed unnoticed.
  • Prevent the interception and modification of transactions as they are transmitted through communications systems such as telephone, data, and satellite networks.
  • Prevent interruption of service due to hardware or software failure.

Technologies and Techniques

The following strategies can help ensure the above process steps are satisfied.

Note   These techniques apply regardless of whether the agents interacting with the bank are customers acting directly or another financial institution conducting a third-party transaction.

For more information on each of these areas see Summary of Best Practices below.

  • Confidentiality: Financial data must be kept confidential and intact at all costs. Developers need to apply the principle of least privilege and perform meticulous error handling to minimize the risk of privilege escalation and unexpected failure. In software, the rarely executed error-handling routines are where the unexpected behaviors that lead to privilege escalation tend to reside. Ensure that encryption routines are modular so that they may be replaced with minimal expense. Do not rely on untrusted encryption libraries, because they may lack cryptographic rigor or contain vulnerabilities that weaken or compromise the ciphers.

  • Availability: Before banking systems were tied together so intricately, the standard operating procedure during computer downtime involved reverting to the manual processes that were in place before computers were integrated into the system. However, reliance on the conveniences that such an automated system provides has rendered the option of reverting to manual calculation wholly impractical. As a result, computer downtime will inevitably result in losses, so there need to be contingencies in place to handle unexpected events. Every system needs to be at the very least doubly redundant. Failover should be immediate and automated for all of these systems. Availability is usually stressed as a responsibility of system administrators and IT professionals, but developers can indirectly affect the availability of a system by improving portability of the applications they develop and ensuring reliability and robustness through good coding practices.

    Note   For large mission critical systems, some applications are deployed across a heterogeneous environment to increase variance and hopefully improve the chances that conditions contributing to the failure of one system may not affect a different portion of the operating environment. Consider, for example: a Web server could be deployed on Microsoft Windows NT running IIS, and another Web server could be deployed on Red Hat Linux running the Apache Web server. If an Internet worm brought down the Linux server, it might not affect the Windows server. In this case, the variance in the environment affords some level of protection in much the same way that the principle of genetic variance presumably works in biological systems.

  • Change Management: It is important that developers be held responsible for changes that are made to software systems. Modifications to bank software that allow fraudulent transactions to be executed have historically been perpetrated not by viruses from external sources but by disgruntled developers or employees. Accountability measures need to be put into place that prevent people from inserting arbitrary changes into critical bank software systems. Code reviews should be conducted frequently by teams of developers and testers to ensure changes are justified. Source control systems should be in place that prevent unauthorized changes, ideally by enforcing some kind of access control list, or that at the very least flag recent changes for review so that a clear audit trail can be constructed in the event of suspected wrongdoing.

  • Authentication: It is necessary to ensure that the transactions that take place within the bank are executed solely through legitimate agents. Authentication is the primary means of ensuring that agents are who they claim to be. When setting up authentication systems, be sure to enforce strong passwords (a small sketch follows this list). Do not enable or provision "guest" accounts or other means of access that do not correlate a specific identity with the accounts or resources being utilized. When storing passwords, be sure to use sufficiently strong encryption to protect the credentials from attackers that might gain access to them. Employ technologies such as SSL and digital certificates. Enforce password policies that expire credentials on a semi-regular basis.

  • Make sure that users are automatically logged off after being idle for a specific period of time. When allowing users to access account information remotely over an insecure network, such as the Internet, consider using a two-factor authentication system to lessen the likelihood that account identities are stolen. Consider using nonstandard, nonconsecutive user identifiers. For example, many banks allow users to log on using their social security numbers as user names. This is not a good idea, because the social security number is itself a sensitive piece of information and should not be exposed. It is also considered immutable except in unique cases, so in the event that a social security number is exposed, no action can be taken to change or hide the user's account credentials to prevent further tampering.
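
As a small sketch of the strong-password enforcement mentioned under Authentication above: the rule below requires a minimum length and at least one non-alphanumeric character. The seven-character minimum mirrors the best-practices section later in this paper; the exact policy is an assumption each institution would set itself.

    using System.Text.RegularExpressions;

    // Minimal sketch of a password-strength gate applied at account
    // creation and at every password change.
    static class PasswordPolicy
    {
        public static bool IsStrong(string candidate)
        {
            if (candidate == null || candidate.Length < 7) return false;
            // require at least one character that is neither a letter nor a digit
            return Regex.IsMatch(candidate, "[^a-zA-Z0-9]");
        }
    }

Rejecting weak credentials at the point of entry is far cheaper than detecting the account compromises they eventually invite.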

Additional Resources

Definition and examples of economic arbitrage: https://economics.about.com/cs/finance/a/arbitrage.htm

BASEL II standards: https://www.bis.org/publ/bcbs118.htm

Risk management for electronic banking and electronic money activities: https://www.bis.org/publ/bcbs35.pdf

Risks in Computer and Telecommunications Systems: https://www.bis.org/publ/bcbsc136.pdf

Chronology of required data breach disclosures since Feb 2005: https://www.privacyrights.org/ar/ChronDataBreaches.htm

Other Regulatory Constructs

Summary of the Legislation

This section provides a brief summary of two pieces of legislation and a standard that are less likely to impact your application development.

Federal Information Security Management Act

The Federal Information Security Management Act (FISMA) was passed into law in 2002 in order to mandate a set of federally recognized information security standards. The act states that Federal Information Processing Standards (FIPS) compliance is mandatory for all government agencies — if you sell software into any branch of the federal government, it must be compliant. Like many of the other acts described, FISMA focuses on the confidentiality, integrity, and availability of sensitive information. The act contains guidelines that can be used to measure the potential impact of a breach and therefore determine the level of protection necessary.

You can find more information at https://csrc.nist.gov/sec-cert/.

BS 7799

BS 7799 (guidelines for information security risk management) is a set of recommendations that is likely to become an ISO standard in the near future. It is a comprehensive set of standards covering:

  • Risk assessment.
  • Risk treatment.
  • Management decision-making.
  • Risk re-assessment.
  • Monitoring and reviewing of risk profile.
  • Information security risk in the context of corporate governance.
  • Compliance with other risk-based standards and regulations.

The purpose of BS 7799 is to help an organization establish a comprehensive information security policy.

You can find more information at https://www.thewindow.to/bs7799/index.htm.

EU Data Protection Directive

The EU Data Protection Directive took effect in October 1998. It defines a standard by which protection of sensitive data can be judged, and it prohibits the transfer of this data to any nation that does not meet the standard. Because of the complexities of the law and the resulting interruptions to the business of many U.S. companies, in July 2000 the European Commission approved the Safe Harbor framework to streamline the process. Safe Harbor provides a framework by which you can ensure the adequacy of privacy protection in your company. Many of the adequacy requirements are procedural. Those that impact application development fall into the familiar buckets: confidentiality, integrity, and availability of sensitive data.

You can find more information on Safe Harbor at https://www.export.gov/safeharbor/index.html.

For more information on the EU Data Protection Directive, go to https://www.cdt.org/privacy/eudirective/EU_Directive_.html.

Technology and Technique Best Practices

Summary of Best Practices

Many of the technologies and techniques described for the regulations above are similar. The sections above describe each area as it applies to the specific legislation. This section goes into more detail describing key best practices for each area.

  • Confidentiality: Do not rely on custom or untrusted encryption routines. Use the cryptographic APIs provided by the OS platform, because they have been thoroughly inspected and rigorously tested. Use an asymmetric algorithm such as RSA when it is not possible to safely share a secret between the party encrypting and the party decrypting the data. A symmetric algorithm such as AES can be used when it is possible to share a secret before the encrypted communication begins. It is important to choose an appropriately sized key so that the encryption is not easily broken. When using RSA, choose a 2048-bit key. When using AES, choose a 128-bit key. These key sizes give some room for growth but will eventually need to be replaced with larger keys as computing power increases.
  • Integrity: Hashing should be used to store confidential information, such as passwords, that may need to be validated but won't need to be retrieved in whole form. Integrity checks should be used to ensure that confidential data has not been tampered with. Use SHA1 when hashing and use HMAC-SHA1 when conducting integrity checks. It is important, however, to keep in mind that the hashing algorithm may need to change over time as computing power increases and previously strong algorithms fall out of favor.
  • Availability: Data availability includes the use of reliable storage solutions such as RAID and offsite backups. It is also important to make sure that these backups can be secured. For developers, this means that applications should be designed so that key information is archivable and recoverable even if there is a terminal failure in the system. Where encryption is used, this means having a mechanism to export keys so that data can be recovered if the system is destroyed. Another important consideration for developers is secure failover of the system. This means that error handling code must not weaken the security of the system and, where appropriate, it should support clustering and safe failover. Error handling in particular is a common point of failure. When designing and implementing your error handling mechanisms, keep the following guidelines in mind:
    • Create a consistent error-handling architecture and use it throughout your application.
    • Do not reveal sensitive information to the user, or the caller, when your application fails.
    • Catch exceptions or error return values on every API call that can provide such information.
    • Don't fail to an insecure state; make sure you clean up your resources appropriately and take care of any sensitive data that may be in memory or on the file system.
  • Auditing and Logging: Carefully avoid logging any sensitive data — there is no guarantee that access to the logs and access to the sensitive data will require the same privileges. If you are involved in system deployment, ensure that the event log is exposed only to administrators. Even if no private information is logged, the information contained in the log can be used to further attacks on the system. If you are using ASP.NET 2.0 to write your application, you can use the new health-monitoring feature to instrument your application and capture relevant information. For other .NET applications on the Windows platform, you can use the System.Diagnostics.EventLog class. For unmanaged applications, you can use the ReportEvent API within the Windows Platform SDK. A common attack that malicious users may employ to tamper with logs is to overwhelm the logging mechanism with millions of log events that either cause important information to be lost or obscure the important data. Guard against this by throttling your logging (a sketch follows this list). You may also consider logging to a separate protected system to guard against attacks by unauthorized users. Event logs should contain enough information to make it possible to reconstruct system activity up to any point of failure so that the error can be quickly diagnosed and fixed.
  • Authentication: When setting up authentication systems, be sure to enforce strong passwords. At a minimum, require passwords to be seven characters long and contain at least one non-alphanumeric character. Do not enable or provision "guest" accounts or other means of access that do not correlate a specific identity with the accounts or resources being utilized. When storing passwords, be sure to use sufficiently strong encryption to protect the credentials from attackers that might gain access to them. Enforce password policies that expire credentials on a semi-regular basis. Make sure that users are automatically logged off after being idle for a specific period of time. If you are writing an ASP.NET application, you can set the slidingExpiration configuration attribute to true so that authentication tickets expire after a set period of inactivity.
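
The following C# sketch ties together two of the points above: it writes audit events through the System.Diagnostics.EventLog class and applies a crude per-minute throttle so that a flood of events cannot drown out earlier entries. The source name and threshold are illustrative assumptions, and the event source must be registered by an administrator (typically at install time) before WriteEntry will succeed.

    using System;
    using System.Diagnostics;

    // Minimal sketch: throttled writes to the Windows event log. Dropping
    // excess events is shown for brevity; a real system might divert them
    // to a secondary protected store instead.
    class ThrottledAuditLog
    {
        private const int MaxEntriesPerMinute = 300; // illustrative threshold
        private readonly EventLog _log;
        private DateTime _windowStart = DateTime.UtcNow;
        private int _countThisWindow;

        public ThrottledAuditLog()
        {
            _log = new EventLog("Application");
            _log.Source = "MyComplianceApp"; // hypothetical, pre-registered source
        }

        public void Write(string message)
        {
            DateTime now = DateTime.UtcNow;
            if ((now - _windowStart).TotalSeconds >= 60)
            {
                _windowStart = now;   // start a new one-minute window
                _countThisWindow = 0;
            }
            if (_countThisWindow++ < MaxEntriesPerMinute)
            {
                _log.WriteEntry(message, EventLogEntryType.Information);
            }
        }
    }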

Additional Resources

The following resources provide a great source for best practices in a variety of areas:

Conclusion

Understanding compliance can be difficult. There is a wide variety of regulations, and information on them is scattered across many locations. There are very few resources mapping regulatory requirements to software development requirements or impacts on the software development process. However, the difficulty in understanding the legislation does not lessen its importance. These regulations are being actively enforced through litigation in federal and state courts. Judges and juries are unsympathetic toward the difficulties of compliance. Developers and companies should be worried about the ramifications of noncompliance in a court of law, as well as in the court of public opinion when people feel their privacy has been violated.

The regulations themselves may be complex, but meeting their requirements doesn't have to be. It boils down to the following critical steps:

  • Identify which regulations are important requirements for your industry and for the specific application you are developing.
  • Ensure the requirements are part of your formal development process from requirements analysis through design, implementation, testing, and deployment.
  • Follow best practices as appropriate in the areas of confidentiality, integrity, availability, auditing and logging, and authentication.

The Windows development platform, and especially the .NET Framework, makes it particularly easy to use cryptography, data integrity features, and user authentication services to meet best practices. The following list provides a summary of the key best practices:

  • Use approved industry standard cryptographic algorithms (such as RSA with a 2048-bit key) to protect the confidentiality of sensitive data.
  • Use integrity checks (such as HMAC-SHA1) to ensure the integrity of sensitive data.
  • Use a well-architected error-handling framework to ensure the availability of sensitive data.
  • Use event logging or ASP.NET health monitoring to ensure that modification and usage of sensitive data are auditable.
  • Use platform-provided authentication services to verify the identity and role of any user trying to access or modify sensitive data.
  • Consider the use of two-factor authentication to provide an extra level of confidence in the identity of the authenticated user.

Use the information presented in this paper to plan your strategy for compliance and as a starting point for a more in-depth investigation into how your development processes may need to adapt to this changing environment.