


Question

Network Security Controls

ISC is an international manufacturing company with over 100 subsidiaries worldwide. ISC prepares consolidated monthly financial statements based on data provided by the subsidiaries. Currently, the subsidiaries send their monthly reports to the ISC corporate offices in Phoenix as PDF or spreadsheet attachments to e-mail messages. The financial data are then transcribed by data-processing clerks and entered into the corporate database, from which consolidated statements are prepared. Because the data must be reentered manually into the corporate system, the process takes three to four days, and it is prone to transcription and other clerical errors. After the data are loaded into the system, verification programs check footings, cross-statement consistency and dollar range limits. Any errors in the data are traced and corrected, and the reporting subsidiaries are notified of all errors via e-mail.
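For illustration, the kinds of checks the verification programs perform (footings, cross-statement consistency, dollar range limits) can be sketched in Python. The report layout, field names and dollar limits below are hypothetical, not ISC's actual system:

```python
# Hypothetical sketch of the verification checks described above.
# Report structure, field names and dollar limits are illustrative.

def verify_report(report: dict) -> list[str]:
    """Return a list of error messages; an empty list means the report passed."""
    errors = []

    # Footing check: detail line items must sum to the reported total.
    detail_sum = sum(report["line_items"])
    if detail_sum != report["total"]:
        errors.append(f"footing error: line items sum to {detail_sum}, "
                      f"but reported total is {report['total']}")

    # Dollar range limit: flag amounts outside a plausible range.
    for amount in report["line_items"]:
        if not (-1_000_000 <= amount <= 1_000_000):
            errors.append(f"range error: {amount} outside dollar limits")

    # Cross-statement consistency: net income on the income statement
    # must match the amount carried to retained earnings.
    if report["net_income"] != report["retained_earnings_addition"]:
        errors.append("cross-statement error: net income does not match "
                      "retained earnings addition")

    return errors

sample = {
    "line_items": [120, 80, -50],
    "total": 150,
    "net_income": 150,
    "retained_earnings_addition": 150,
}
print(verify_report(sample))  # prints [] (report passes all checks)
```

With the new system, checks like these can run at load time and reject a report before it reaches the corporate database, rather than days later.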

The company has decided to upgrade its computer communications network with a new system that will support more timely receipt of data at corporate headquarters. The systems department at corporate headquarters is responsible for the overall design and implementation of the new system. The system will consist of a central server at the corporate offices connected to distributed terminals at each of the subsidiary sites.

The new system will allow clerks at the subsidiary sites to send financial data to the corporate office via the Internet. The system will automatically load the financial data into the corporate database, thus eliminating the error-prone data entry operation.

The company’s controller is pleased with the prospects of the new system, which should shorten the reporting period by three days. He is, however, concerned about security and data integrity during the transmission. He has scheduled a meeting with key personnel from the systems department to discuss these concerns.

Required:
a. Describe the data security and integrity problems that could occur when transmitting data between the subsidiaries and the corporate office.
b. For each problem identified, describe a control procedure that could be employed to minimize or eliminate the problem. (Structure your answer as: problem identification and explanation, followed by control procedure and explanation.)

Explanation / Answer

a) Information security has become a visible issue in business, on the move and at home. Its practice emphasizes preventing attacks that target availability (e.g., denial of service) and those that result in infections by malicious software (malware) that allows a third party to do unauthorized things with data and information (e.g., theft, disclosure, modification or destruction of data).

The Stuxnet worm reported in 2010 altered the operation of an industrial process and was designed to damage physical equipment and modify the operator’s monitoring indications to show that the equipment was working normally.1 This was an attack on data integrity (also referred to as a “semantic attack”) that, if and when replicated on other targets, could cause major problems in critical information infrastructures such as utilities, emergency services, air traffic control and others with a large IT component on which society relies. Data governance is an essential component for strengthening data integrity.

A recent article in the ISACA Journal presents a data governance framework developed by Microsoft for privacy, confidentiality and compliance. It discusses the roles of people, process and technology; the data life cycle; and the principles of data privacy and confidentiality. It also provides links to more detailed papers on the subject of trustworthy computing.2

Here, these topics will be expanded upon, focusing on data integrity, the standards and best practices that support it, and the role of data governance. This article also introduces a nonproprietary data governance framework.

Of the three main domains of information security, availability is closely associated with technology and lends itself to being measured. Downtime is visible and can be expressed as an absolute value (e.g., in minutes per incident) or as a percentage, and it is simple enough to understand that “five nines” (99.999 percent) availability means a total cumulative downtime of around five minutes in a year. Data center operators know what it takes to achieve this.
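The "five nines" arithmetic above can be checked directly: at a given availability percentage, the allowable cumulative downtime per year is the complement of availability times the minutes in a year.

```python
# Cumulative allowable annual downtime at a given availability percentage.

def annual_downtime_minutes(availability_pct: float) -> float:
    minutes_per_year = 365.25 * 24 * 60  # average year, including leap days
    return minutes_per_year * (1 - availability_pct / 100)

for nines in (99.9, 99.99, 99.999):
    print(f"{nines}% availability -> "
          f"{annual_downtime_minutes(nines):.2f} minutes/year of downtime")
# 99.999% works out to roughly 5.26 minutes per year
```

Each additional "nine" cuts the downtime budget by a factor of ten, which is why the jump from four to five nines is so expensive for data center operators.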

Confidentiality is easy enough to explain, but makes sense only if data and documents have been classified into categories that reflect the business need to protect them, such as “public,” “restricted to,” “embargoed until” and “secret.”

The technical people who provide IT infrastructure and services should not be expected to perform this classification, as they may not have enough business knowledge to do so and, through outsourcing and/or cloud computing, they may even be external to the business. Therefore, business functions must take ownership of the data and their classification, while IT service and technology providers support this with tools and processes such as identity and access management (IAM) controls and encryption.
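A minimal sketch of how an enforcement layer might apply business-assigned classifications such as those above; the label names, their ordering and the clearance model are all assumptions made for illustration:

```python
# Hypothetical classification-driven access check: the business owner
# assigns the document's label; the IAM layer only enforces it.

LEVELS = {"public": 0, "restricted": 1, "embargoed": 2, "secret": 3}

def may_read(user_clearance: str, document_label: str) -> bool:
    """A user may read a document whose label does not exceed their clearance."""
    return LEVELS[user_clearance] >= LEVELS[document_label]

print(may_read("restricted", "public"))   # True
print(may_read("restricted", "secret"))   # False
```

The design point is the division of labor: the mapping from document to label is a business decision, while the comparison above is the only part technology providers are asked to implement.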

The simplest metric for confidentiality is binary: An item that should not be disclosed either has not been (confidentiality is preserved) or has been (confidentiality is lost). Unfortunately, this is not a very useful metric, as it does not reveal the impact of such a disclosure, which can range from mild embarrassment to a breach of national security.

When it comes to integrity, the situation is more complex because the word means different things to different people. This creates fertile ground for miscommunication and misunderstandings, with the risk that the activity will not be done well enough because of unclear accountabilities.

The importance of data integrity can be illustrated simply: A person needs hospital treatment that includes taking a daily medication dosage of 10 milligrams (mg). By accidental or deliberate intervention, the electronic record of the treatment is changed to a dosage of 100 mg, with fatal consequences. In another example, what if, as in a work of fiction that predates the Stuxnet attack of 2010, the control systems of a nuclear power station are interfered with to show normal conditions while, in fact, a chain reaction has been triggered?3 Are professionals aware of the many definitions of “data integrity”? Two common definitions follow:

Accuracy and consistency of stored data, indicated by an absence of any alteration in data between two updates of a data record. Data integrity is imposed within a database at its design stage through the use of standard rules and procedures and is maintained through the use of error checking and validation routines.4

Quality of correctness, completeness, wholeness, soundness and compliance with the intention of the creators of the data. It is achieved by preventing accidental or deliberate, but unauthorized, insertion, modification or destruction of data in a database. Data integrity is one of the six fundamental components of information security.5
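The "standard rules and procedures" imposed at a database's design stage, mentioned in the first definition, can be made concrete with a small sketch. Using SQLite and the dosage example from earlier, a CHECK constraint rejects the corrupted value outright; the table, column names and the 50 mg ceiling are illustrative assumptions:

```python
# Integrity imposed at design stage: a CHECK constraint that refuses
# an out-of-range dosage before it ever reaches the record.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE treatment (
        patient_id INTEGER NOT NULL,
        dosage_mg  REAL NOT NULL CHECK (dosage_mg > 0 AND dosage_mg <= 50)
    )
""")

conn.execute("INSERT INTO treatment VALUES (1, 10)")       # accepted

try:
    conn.execute("INSERT INTO treatment VALUES (1, 100)")  # the fatal 100 mg
except sqlite3.IntegrityError as exc:
    print("rejected:", exc)  # the constraint violation is reported here
```

Constraints of this kind stop accidental corruption at the source; deliberate tampering by a privileged user requires the additional controls discussed later.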

There is no doubt that more definitions can be found. They overlap, address different issues and create semantic confusion, which may be one reason databases are among the least protected objects in the IT infrastructure.

This is not the end of the problem statement. The decentralization of information systems and the availability of powerful end-user programming environments, particularly spreadsheets, have created potentially uncontrolled integrity vulnerabilities, because such spreadsheets are used to support executive decisions, possibly without due consideration of data quality and data integrity. Should this be counted as a data quality issue, a data integrity issue or an end-user computing issue?

Perhaps it should be considered as all three, in which case it must be determined who should address them: the data owner, the end user who designed the spreadsheet, the IT department or service provider, or all of them working together.

Triggers of Data Integrity Loss

The previous section used, as an example, untested and undocumented user-designed spreadsheets (aggravated by manual input, particularly when not assisted by validation of the values entered), but there are other, potentially more serious, triggers, both accidental and deliberate; the deliberate ones are discussed in the next section.

To complicate matters, the IT audit function may not have the critical mass to undertake audits covering all of these areas.

Attacks on Data Integrity

Attacks on data integrity involve intentional, unauthorized modifications of data at some point in their life cycle. For the purpose of this article, the data life cycle consists of creation, processing, storage, transmission, archival and destruction.

Fraud is the oldest form of attack on data integrity, and it exists in many variants. The variants will not be discussed in this article, other than to mention an example that, in 2008, made page one in the world news: The “abuse of trust, forgery and unauthorized use of the bank’s computer systems” by a trader at Societe Generale (France) resulted in losses estimated at €4.9 billion.6 Judging from the number of publications and international conferences that deal with fraud, this issue is likely to remain high on the agenda for some time.

Web site defacements have affected many organizations in the private and public sectors for many years, but apart from some reputational damage, none could be considered as having been “catastrophic.”

Logic bombs (unauthorized software introduced into a system by one or more of its programmers/maintainers, by Trojan horses or by other means) can also impact data integrity, either by modifying data (as when a formula in a spreadsheet is made incorrect) or by encrypting data and then demanding a ransom for the decryption key. There have been several such attacks in recent years, mainly affecting hard drives in personal computers. It should be expected that attacks of this type will sooner or later be launched against servers.

Unauthorized modifications of operating systems (OSs), server and network software, applications software (such as undocumented backdoors), database tables, production data and infrastructure configuration are also considered attacks on data integrity. It can be assumed that the findings of IT audits regularly include weaknesses in key processes, particularly the management of privileged access, change management, segregation of duties (SoD) and the monitoring of logs. These weaknesses make such modifications possible and hard to detect (until an incident occurs).
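One detective control for such modifications is baseline hashing: record a cryptographic fingerprint of each configuration or software artifact, then compare on a schedule. A minimal sketch follows; real file-integrity-monitoring tools add protected baseline storage, scheduling and alerting, and the configuration content here is invented:

```python
# Detecting unauthorized modification by comparing SHA-256 fingerprints
# against a previously recorded baseline.
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of an artifact's contents."""
    return hashlib.sha256(data).hexdigest()

# Baseline taken when the configuration was known-good.
baseline = fingerprint(b"max_login_attempts=3\n")

# Later scan: an attacker has quietly relaxed the setting.
current = fingerprint(b"max_login_attempts=300\n")

if current != baseline:
    print("integrity alert: configuration changed since baseline")
```

The control is only as strong as the baseline's protection: if the attacker can rewrite the stored hashes along with the files, the comparison proves nothing, which is why baselines are typically kept on separate, write-restricted media.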

Another form of attack on data integrity is interference with supervisory control and data acquisition (SCADA) systems, such as those used by critical infrastructures (e.g., electricity, water supply) and in industrial processes. Frequently, these are not installed, operated or managed by the IT function. The attack on the Iranian uranium enrichment facilities in 2010 was designed to modify the behavior of the centrifuges while displaying normal conditions in the control panels.

It should be noted that many of these control systems are not connected to the Internet and, in the case of the injection of Stuxnet software, required a manual intervention, which confirms that “people” remain the weakest link in information security/assurance.

b)

Information assurance (IA) is the practice of managing risks related to the use, processing, storage and transmission of information or data and to the systems and processes used for those purposes. IA has grown from the practice of information security, which, in turn, grew out of the practices and procedures of computer security.

Service providers (e.g., IT organizations, outsourcers) are clearly responsible for technologies and their operation and put measures in place to provide confidentiality, integrity and availability (CIA) in the operational environment. With regard to protecting data, they provide services such as backups and disaster recovery arrangements with clearly defined recovery time and recovery point objectives (i.e., how quickly service must be restored and how much data may be lost in an incident), documented in service level agreements (SLAs). However, service providers do not have responsibility for data governance and its many related activities.
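Returning to the controller's concern about transmission in the original question, one integrity control the operational environment can provide is a message authentication code: the sending subsidiary signs each report with a shared key, and corporate verifies the signature before loading the data. A sketch, with an invented key and report format; in practice each subsidiary would receive its key out of band and the transport would also be encrypted:

```python
# HMAC over a transmitted report: corporate recomputes the tag and
# refuses to load data whose tag does not match.
import hashlib
import hmac

SHARED_KEY = b"per-subsidiary secret key"  # hypothetical; distributed out of band

def sign(report: bytes) -> str:
    """Compute the sender's authentication tag for a report."""
    return hmac.new(SHARED_KEY, report, hashlib.sha256).hexdigest()

def verify(report: bytes, tag: str) -> bool:
    """Constant-time check that the received report matches its tag."""
    return hmac.compare_digest(sign(report), tag)

report = b"subsidiary=DE-04;net_income=150000"
tag = sign(report)

print(verify(report, tag))                                  # True
print(verify(b"subsidiary=DE-04;net_income=950000", tag))   # False
```

Any in-transit alteration, accidental corruption or deliberate tampering alike, changes the recomputed tag and causes the report to be rejected before it reaches the database.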

SLAs place clearly defined responsibilities on IT service providers, but not on data and system owners. This results in a lack of clarity related to accountabilities and, therefore, an inability to ensure that data have been properly classified and that the roles and responsibilities of data users and, in particular, privileged users are managed in a way that reflects their critical roles. As a result, data integrity remains the poor relation of information security and IA.

The Need for Data Governance

Data governance addresses specifically the information resources that are processed and disseminated. The key elements of data governance can be categorized into six major areas: data accessibility, data availability, data quality, data consistency, data security and data auditability. DAMA International produced the Data Management Body of Knowledge (DMBOK),15 which presents a comprehensive framework for data management and governance, including tasks to be performed and inputs, outputs, processes and controls.

Conclusion

The principle of “garbage in, garbage out” (GIGO) is as valid today as it was when it was first formulated some 60 years ago. The difference between then and now is that the volume of data in digital form has grown exponentially, and this growth has not been accompanied by the development and strengthening of data governance disciplines. Of the three pillars of information security (CIA), availability remains the only component for which metrics are well defined and generally accepted.

The absence of data integrity metrics should be seen as an obstacle, because without them an enterprise cannot demonstrate whether confidentiality or integrity has become “better” or “worse” since procedures and processes were introduced to manage them.

As long as data governance does not receive the same degree of attention as IT governance (and the latter often remains the weak link in corporate governance), organizations will be exposed to significant operational, financial, noncompliance and reputational risk.