Question

The command office has asked for a presentation on security mechanisms, including access control models, authentication, authorization, and encryption. Describe the differences between them and identify situations where each is appropriate. Discuss how encryption relates to storage, network traffic, secure Web access, and the e-mailing of documents. One topic of concern is symmetric versus public-key encryption.

The presentation should include the following information:

Comparison of security mechanisms

An explanation of public key cryptography

Historical background of the development of encryption

Typical uses of public key cryptography

Symmetric versus public-key encryption

Encryption and how it relates to storage, network traffic, secure Web access and e-mail

Explanation / Answer

Access control mechanisms are a necessary and crucial design element of any application's security. In general, a web application should protect front-end and back-end data and system resources by implementing access control restrictions on what users can do, which resources they can access, and what functions they are allowed to perform on the data. Ideally, an access control scheme should protect against unauthorized viewing, modification, or copying of data. Access control mechanisms can also help limit malicious code execution and unauthorized actions by an attacker exploiting infrastructure dependencies (DNS servers, ACE servers, etc.).

Authorization and Access Control are terms often mistakenly interchanged. Authorization is the act of checking whether a user has the proper permission to access a particular file or perform a particular action, assuming that the user has already successfully authenticated. Authorization is very much credential-focused and dependent on specific rules and access control lists preset by the web application administrators or data owners. Typical authorization checks involve querying for membership in a particular user group, checking possession of a particular clearance, or looking for the user on a resource's approved access control list, akin to a bouncer at an exclusive nightclub. Any access control mechanism clearly depends on effective, forge-resistant authentication: authorization decisions are only as trustworthy as the identity they are based on.
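
To make the distinction concrete, here is a minimal sketch in Python of an authorization check that runs only after authentication has already succeeded. All of the user names, groups, and the ACL structure are hypothetical, not taken from any particular framework.

    # Minimal, illustrative authorization check (hypothetical users, groups, ACL).
    # Authentication is assumed to have already happened; the only question here
    # is whether the authenticated user may perform this action on this resource.

    GROUP_MEMBERS = {
        "finance": {"alice", "bob"},
        "admins": {"carol"},
    }

    # Per-resource access control list: which groups may perform which actions.
    ACL = {
        "/reports/q3.pdf": {"read": {"finance", "admins"}, "write": {"admins"}},
    }

    def is_authorized(username, resource, action):
        """True if any of the user's groups appears on the resource's ACL."""
        allowed_groups = ACL.get(resource, {}).get(action, set())
        user_groups = {g for g, members in GROUP_MEMBERS.items() if username in members}
        return bool(user_groups & allowed_groups)

    print(is_authorized("alice", "/reports/q3.pdf", "read"))   # True
    print(is_authorized("alice", "/reports/q3.pdf", "write"))  # False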

Access Control refers to a much more general way of controlling access to web resources, including restrictions based on the time of day, the IP address of the HTTP client, the domain of the HTTP client, the type of encryption the HTTP client can support, the number of times the user has authenticated that day, the possession of any of several types of hardware or software tokens, or any other derived variables that can be extracted or calculated easily.
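
The sketch below, again with invented policy values, shows how such contextual restrictions (time of day, client IP range) differ from the credential-focused ACL check above; a real deployment would read these rules from the organization's policy rather than hard-coding them.

    # Illustrative contextual access-control check (hypothetical policy values).
    from datetime import datetime, time
    from ipaddress import ip_address, ip_network

    ALLOWED_HOURS = (time(8, 0), time(18, 0))      # business hours only
    ALLOWED_NETWORK = ip_network("10.0.0.0/8")     # internal clients only

    def context_permits(client_ip, now=None):
        """Allow the request only during business hours and from the internal network."""
        now = now or datetime.now()
        in_hours = ALLOWED_HOURS[0] <= now.time() <= ALLOWED_HOURS[1]
        in_network = ip_address(client_ip) in ALLOWED_NETWORK
        return in_hours and in_network

    print(context_permits("10.1.2.3"))     # depends on the current time
    print(context_permits("203.0.113.9"))  # False: external address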

Before choosing the access control mechanisms specific to your web application, several preparatory steps can help expedite and clarify the design process:

Try to quantify the relative value of the information to be protected in terms of confidentiality, sensitivity, classification, privacy, and integrity, both for the organization and for individual users. Consider the worst-case financial loss that unauthorized disclosure, modification, or denial of service of the information could cause. Designing elaborate and inconvenient access controls around unclassified or non-sensitive data can be counterproductive to the ultimate goal or purpose of the web application.

Determine how much interaction data owners and creators will have within the web application. Some applications may restrict the creation or ownership of data to administrative or built-in system users. Are specific roles required to further codify the interactions between different types of users and administrators?

Specify the process for granting and revoking user access control rights on the system, whether it be a manual process, automatic upon registration or account creation, or through an administrative front-end tool.

Clearly delineate the types of role-driven functions the application will support. Try to determine which specific user functions should be built into the web application (logging in, viewing their own information, modifying their information, sending a help request, etc.) as well as administrative functions (changing passwords, viewing any user's data, performing maintenance on the application, viewing transaction logs, etc.).

Try to align your access control mechanisms as closely as possible with your organization's security policy. Many items in the policy map directly onto the implementation side of access control (acceptable times of day for certain data access, types of users allowed to see certain data or perform certain tasks, etc.). These mappings usually work best with Role-Based Access Control (RBAC).
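
As a deliberately simplified illustration of how policy statements can map onto Role-Based Access Control, the Python sketch below uses hypothetical roles and permission names; a real application would load these from its policy store rather than hard-coding them.

    # Minimal Role-Based Access Control sketch (hypothetical roles and permissions).

    ROLE_PERMISSIONS = {
        "user":  {"view_own_profile", "edit_own_profile", "submit_help_request"},
        "admin": {"view_any_profile", "reset_password", "view_transaction_logs"},
    }

    # Roles assigned to each account; one account may hold several roles.
    USER_ROLES = {
        "dave": {"user"},
        "erin": {"user", "admin"},
    }

    def has_permission(username, permission):
        """True if any role assigned to the user grants the requested permission."""
        return any(permission in ROLE_PERMISSIONS.get(role, set())
                   for role in USER_ROLES.get(username, set()))

    print(has_permission("dave", "view_transaction_logs"))  # False
    print(has_permission("erin", "view_transaction_logs"))  # True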

Authorization:

Authorization is the process of giving someone permission to do or have something. In multi-user computer systems, a system administrator defines which users are allowed access to the system and what privileges they have (such as access to particular file directories, hours of access, amount of allocated storage space, and so forth). Once someone has logged in to a computer operating system or application, the system or application may need to determine which resources the user can be given during the session. Authorization is therefore sometimes seen as both the preliminary setting up of permissions by a system administrator and the actual checking of those permission values when a user requests access.
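
The two halves of that definition, the administrator's up-front provisioning and the later check at access time, can be pictured with a small sketch. The account settings and function names below are purely illustrative and are not tied to any real operating system API.

    # Illustrative split between provisioning (the administrator records privileges
    # up front) and enforcement (the system consults them when access is attempted).

    ACCOUNTS = {}  # username -> privileges recorded by the administrator

    def provision(username, directories, quota_mb):
        """Provisioning step: record what the user will be allowed to do."""
        ACCOUNTS[username] = {"directories": set(directories), "quota_mb": quota_mb}

    def may_write(username, directory, file_size_mb, used_mb):
        """Enforcement step: check the stored privileges at access time."""
        acct = ACCOUNTS.get(username)
        if acct is None:
            return False
        within_quota = used_mb + file_size_mb <= acct["quota_mb"]
        return directory in acct["directories"] and within_quota

    provision("frank", ["/home/frank", "/shared/reports"], quota_mb=500)
    print(may_write("frank", "/shared/reports", file_size_mb=20, used_mb=100))  # True
    print(may_write("frank", "/home/grace", file_size_mb=1, used_mb=0))         # False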

Data Encryption and Decryption:

Encryption is the process of translating plain text data (plaintext) into something that appears to be random and meaningless (ciphertext). Decryption is the process of converting ciphertext back to plaintext.

To encrypt more than a small amount of data, symmetric encryption is used. A symmetric key is used during both the encryption and decryption processes. To decrypt a particular piece of ciphertext, the key that was used to encrypt the data must be used.

The goal of every encryption algorithm is to make it as difficult as possible to decrypt the generated ciphertext without using the key. If a really good encryption algorithm is used, there is no technique significantly better than methodically trying every possible key. For such an algorithm, the longer the key, the more difficult it is to decrypt a piece of ciphertext without possessing the key.

It is difficult to determine the quality of an encryption algorithm. Algorithms that look promising sometimes turn out to be very easy to break, given the proper attack. When selecting an encryption algorithm, it is a good idea to choose one that has been in use for several years and has successfully resisted all attacks.
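
A minimal sketch of symmetric encryption in Python follows, assuming the third-party cryptography package is installed; its Fernet recipe is an authenticated symmetric scheme built on AES, and the message text here is made up.

    # Symmetric encryption sketch using the (assumed installed) `cryptography`
    # package's Fernet recipe: one shared key both encrypts and decrypts.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # the single shared secret key
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"Quarterly figures attached.")
    print(ciphertext)                        # appears random and meaningless

    plaintext = cipher.decrypt(ciphertext)   # requires the same key that encrypted it
    print(plaintext)                         # b'Quarterly figures attached.'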

Public Key Cryptography:

Asymmetric cryptography, or public-key cryptography, is cryptography in which a pair of keys is used to encrypt and decrypt a message so that it arrives securely. A network user generates (or is issued) a public and private key pair, and a certificate authority can certify the public key. Any other user who wants to send an encrypted message can get the intended recipient's public key from a public directory, use it to encrypt the message, and send the message to the recipient. When the recipient gets the message, they decrypt it with their private key, which no one else should have access to.

Whitfield Diffie and Martin Hellman, researchers at Stanford University, first publicly proposed asymmetric encryption in their 1976 paper, "New Directions in Cryptography." (The concept had been independently and covertly proposed by James Ellis several years earlier while he was working for the British Government Communications Headquarters.) An asymmetric algorithm, as outlined in the Diffie-Hellman paper, relies on a trapdoor one-way function: a function that is easy to compute in one direction but difficult or impossible to reverse. For example, it is easy to compute the product of two given numbers, but it is computationally much harder to find the two factors given only their product. Given both the product and one of the factors, however, it is easy to compute the second factor, which demonstrates that the hard direction of the computation becomes easy when some secret is known. The function used, the algorithm, is known universally, but this knowledge does not enable decryption of the message; the only added information that is necessary and sufficient for decryption is the recipient's secret key.
In cases where the same mathematical operation is used to encrypt and decrypt, as in RSA, a message can also be signed by a specific sender: if the sender transforms the message with their private key, anyone can recover it with that sender's public key, which authenticates the sender, since only the holder of the private key could have produced it.
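
To make the trapdoor idea and the signing trick concrete, here is a deliberately tiny and insecure RSA sketch with textbook-sized primes; the numbers are illustrative only, and real deployments use keys of 2048 bits or more together with padding schemes such as OAEP.

    # Toy RSA, purely to illustrate the trapdoor and signing ideas above.

    p, q = 61, 53                # two secret primes known only to the key owner
    n = p * q                    # 3233: public modulus, easy to compute...
    phi = (p - 1) * (q - 1)      # ...but hard to recover from n at realistic sizes
    e = 17                       # public exponent, coprime with phi
    d = pow(e, -1, phi)          # private exponent, the "trapdoor" secret (2753)

    def encrypt(m, public=(e, n)):
        """Anyone can encrypt with the recipient's public key (e, n)."""
        exp, mod = public
        return pow(m, exp, mod)

    def decrypt(c, private=(d, n)):
        """Only the holder of the private exponent d can decrypt."""
        exp, mod = private
        return pow(c, exp, mod)

    message = 65
    c = encrypt(message)
    print(c, decrypt(c))         # ciphertext, then the original 65

    # Signing reverses the roles: transform with the private key, verify with the public one.
    signature = pow(message, d, n)
    print(pow(signature, e, n) == message)   # True: the signature verifies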

Historical background of the development of encryption:

Threats to computer and network security increase with each passing day and come from a growing number of sources. No computer or network is immune from attack. A recent concern is the susceptibility of the power grid and other national infrastructure in the United States to a systematic, organized attack by other nations or terrorist organizations.

Encryption, the ability to store and transmit information in a form that is unreadable to anyone other than the intended recipients, is a critical element of our defense against these attacks. Indeed, people have spent thousands of years in the quest for strong encryption algorithms.
Goodbye DES

AES is standardized as Federal Information Processing Standard 197 (FIPS 197) by the National Institute of Standards and Technology (NIST), a non-regulatory federal agency. Prior to AES, the Data Encryption Standard (DES) became the federal standard for symmetric block encryption (FIPS 46) in 1977.

DES was based on an algorithm developed by IBM and modified by the National Security Agency (NSA). DES was considered unbreakable in the 1970s except by brute-force attack -- that is, trying every possible key (DES uses a 56-bit key, so there are 2^56, or 72,057,594,037,927,936, possible keys). By the late 1990s, however, it was possible to break DES in a matter of days, owing to the relatively small key size and 64-bit block size and to advances in computing power consistent with Moore's Law.

This achievement signaled the end of DES, although Triple DES (DES applied three times with different keys, giving a nominal 168-bit key) remains acceptable for federal use until 2030.

Hello AES

In January 1997, NIST announced a competition to find a successor to DES. To allay suspicions that the NSA had placed "back doors" in DES, the competition was to be open and public, and the winning encryption algorithm would be available for use royalty-free worldwide. The selection criteria included not only cryptographic strength (resistance to linear and differential cryptanalysis) but also ease of implementation and performance in software and hardware.

Over the course of three competitive rounds and intense cryptanalysis by the world's foremost experts on encryption, NIST selected the winner in October 2000: the Rijndael (pronounced "Rhine doll") algorithm of Belgian cryptographers Joan Daemen and Vincent Rijmen. FIPS 197 was published on Nov. 26, 2001, and AES remains the symmetric cipher of choice for government and commercial use today. Although originally approved only for the encryption of non-classified government data, AES was approved in 2003 for use with Secret and Top Secret classified information of the U.S. government.