
Combine BLP and Biba


Question

Combine BLP and Biba

Explanation / Answer

Bell-LaPadula (secrecy):
========================
- can only read info at or below your level (obvious)
- can only write info at or above your level
  -- why? if a top-secret user can write data that users
     who are merely confidential can read, the top-secret
     user may (inadvertently) leak secret information down
     to these lower levels...

Biba (integrity):
=================
- can only read info at or above your level
- can only write info at or below your level

Information flow (e.g. from a variable x to a variable y) may
be explicit (y = x) or implicit:

    if ( x == 1 )
        y = 0;
    else
        y = 1;

Combining Biba and BLP:
-----------------------
- May use the same "divisions" for both... that is, we have a
  single set of security levels (TOP SECRET, SECRET, CONFIDENTIAL,
  RESTRICTED, UNCLASSIFIED) and we use that same set for both
  Biba and BLP, and each user's Biba level is the same as his
  BLP level.
- May have different divisions for the two:

  (a) "Biba has boundaries within BLP"
      - e.g., BLP has 3 levels, Biba has 5:

            Biba        BLP
            ----        ---
            TS   --->   TS
            S    \
            C     -->   OTH
            R    /
            U    --->   U

        S, C, R from Biba all map to OTH in BLP.

        Now imagine a user who has Biba C and BLP OTH.
            From Biba C:   can't write TS, S
                           can't read  R, U
            From BLP OTH:  can't read  TS
                           can't write U
        So when we compose those, we are left with:
            can't read:  TS, R, U
            can't write: TS, S, U
        So... the user can read S, C and can write C, R ... so
        infection can only spread from the higher levels (e.g. S)
        to the lower levels (e.g. R). The opposite holds for BLP
        within Biba. When Biba and BLP use the same divisions,
        infection can only spread within a particular level.
        This whole discussion is w.r.t. the areas of overlap...
        the 3 Biba levels that map onto the 1 BLP level.

Low-watermark policy (Biba)
---------------------------
For a set of inputs { N_1, N_2, ..., N_k } that produce some
output O, the integrity of O is the MIN of the integrities
of the inputs:

    I(O) = MIN( I(N_1), I(N_2), ..., I(N_k) )

where I(x) returns the integrity level of x.
This makes intuitive sense... information is only as
trustworthy as the least trusted contributor of that info.

Ring policy
-----------
Users cannot invoke everything they can read. This gets at
the different permissions for "read" vs. "execute" access.
Practically, it corresponds to having read permission on a
file that you do not also have execute permission on.
-- Attempts to make a distinction which cannot be made
   with generalized information interpretation.

NB: Building a precise system for integrity is NP-complete,
as is doing the same for a precise system for secrecy.

The most trusted programmer should have the highest security
level, but under BLP other users would then not be able to
read/execute that programmer's programs. So the most trusted
programmer must instead have the lowest security level under
BLP, but then only other users with that classification will
be able to read/execute his programs...

Point: If we mix Biba and BLP, we are safe from viruses (they
won't spread beyond a partition), but no user can write a
program that can be used throughout the system.
--> This seems worth revisiting.
--> As does the question of a user who has different Biba and
    BLP classifications... why should it be impossible for a
    user's Biba classification and BLP classification to be
    very different? Biba TS, BLP U: according to Biba the user
    can only read other TS files and can write anything... but
    according to BLP, the user can only read U files and can
    write everything... net net: the user can't read anything
    at all, so he can't do any useful work.
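To make the composed checks above concrete, here is a minimal Python
sketch, assuming a single ordered set of levels shared by both models;
the LEVELS list, the RANK table, and the function names are illustrative
choices, not part of any standard API. It evaluates read/write access
under BLP alone, Biba alone, and the two combined, and also applies the
low-watermark rule.

    # Ordered security levels, lowest to highest; both models share this set.
    LEVELS = ["UNCLASSIFIED", "RESTRICTED", "CONFIDENTIAL", "SECRET", "TOP SECRET"]
    RANK = {name: i for i, name in enumerate(LEVELS)}

    def blp_can_read(subject, obj):
        # BLP: read only at or below your level ("no read up").
        return RANK[subject] >= RANK[obj]

    def blp_can_write(subject, obj):
        # BLP: write only at or above your level ("no write down").
        return RANK[subject] <= RANK[obj]

    def biba_can_read(subject, obj):
        # Biba: read only at or above your level ("no read down").
        return RANK[subject] <= RANK[obj]

    def biba_can_write(subject, obj):
        # Biba: write only at or below your level ("no write up").
        return RANK[subject] >= RANK[obj]

    def combined_can_read(user_blp, user_biba, obj_level):
        return blp_can_read(user_blp, obj_level) and biba_can_read(user_biba, obj_level)

    def combined_can_write(user_blp, user_biba, obj_level):
        return blp_can_write(user_blp, obj_level) and biba_can_write(user_biba, obj_level)

    def low_watermark(input_levels):
        # Biba low-watermark: output integrity is the MIN of the input integrities.
        return min(input_levels, key=RANK.get)

    if __name__ == "__main__":
        # A user whose Biba and BLP levels are both SECRET can read and write
        # only SECRET objects, so infection stays within that one level.
        for level in LEVELS:
            print(level,
                  "read:", combined_can_read("SECRET", "SECRET", level),
                  "write:", combined_can_write("SECRET", "SECRET", level))
        print(low_watermark(["SECRET", "RESTRICTED", "TOP SECRET"]))  # RESTRICTED

Running it with equal Biba and BLP levels shows that the only level such
a user can both read and write is his own, which is the "infection can
only spread within a particular level" observation above.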
Compartment policy
==================
Partition users into compartments; each user is only able to
access the info required for his duties.
If every user only has access to one compartment at a time, the
system is secure from viral propagation across compartment
boundaries.
+ But in current systems, users can have simultaneous
  access to multiple compartments.

---------------
(3) Flow models
---------------
I think these are meant to be an alternative to the systems
(such as Biba and BLP) that partition users into closed
subsets.

========================
(a) Flow distance policy
========================
- implement a "distance metric", which is the # of sharings
  over which this data has flowed

RULES:
------
+ distance of output info is the max of the distances of the inputs
+ distance of shared info is one more than the distance of
  that info pre-sharing

Enforcement:
------------
+ info has some distance threshold above which that info
  is considered unusable
  --> even by those with whom the information has already
      been shared? Or does this only prevent future sharings?

Example: A process P has distance 2; then P accesses a file F
         which has distance 8. Then, according to the 1st rule,
         P's new distance is 8, and according to the 2nd rule,
         P's distance is incremented to 9. (See the short sketch
         following this flow-models discussion.)

====================
(b) Flow list policy
====================
For each object O, keep a list of all users {U_1, U_2, ..., U_n}
that have had an effect on O.
- f_1, f_2, ..., f_n are the flow lists for U_1, U_2, ..., U_n
- User U_1's flow list f_1 may look like: { U_1 }

Then imagine that object O is affected by U_3, U_7, U_19;
then f_O = { U_3, U_7, U_19 }.
Then if some user U_x wants to access O, we may restrict that
access on the basis of U_x and O's flow list
- e.g. U_x can only access info written by:
      (U_3 AND U_10) OR (U_7 AND U_10)
  which would mean that U_x cannot access O, since that object
  has not been vetted by U_10.

RULES:
------
+ the flow list for the output is the union of the flow
  lists of all inputs

ENFORCEMENT:
------------
+ via Boolean expressions on flow lists

Flow lists can be used to implement the Biba and distance models.

Limited transitivity systems
----------------------------
+ In systems with transitivity limited to a distance of 1, it is
  safe to share info with any user you trust *without having
  to worry* whether such a user has incorrectly trusted another.
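Here is a minimal Python sketch of the flow-distance rules, assuming an
illustrative FlowEntity wrapper and THRESHOLD value (neither name comes
from the notes): an output's distance is the max of its inputs' distances,
and each sharing adds one. It reproduces the worked example above, where
a process at distance 2 that reads a file at distance 8 ends up at 9.

    THRESHOLD = 10  # distances beyond this are treated as unusable

    class FlowEntity:
        """A process or file tagged with a flow distance (number of sharings)."""
        def __init__(self, name, distance=0):
            self.name = name
            self.distance = distance

    def share(consumer, source):
        """consumer reads/receives information from source."""
        if source.distance > THRESHOLD:
            raise PermissionError(source.name + " is beyond the distance threshold")
        # Rule 1: the output's distance is the max of the input distances.
        # Rule 2: shared info is one further from its origin than it was pre-sharing.
        consumer.distance = max(consumer.distance, source.distance) + 1

    # Worked example from the notes: P at distance 2 reads F at distance 8,
    # so P ends up at max(2, 8) + 1 = 9.
    P = FlowEntity("P", distance=2)
    F = FlowEntity("F", distance=8)
    share(P, F)
    print(P.distance)  # 9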
--------------------------
(4) Limited interpretation
--------------------------
We have discussed how the generality of interpretation of most
computer systems (that is, their Turing capability) enables
viral propagation, since data received by some system may be
executed by that system in the way intended by the data's
author.
Thus we may look at ways to restrict this "generality of
interpretation" such that even if a virus is received on
some system, that system will not execute the virus.

A draconian approach is "fixed first-order interpretation",
whereby no program on a system may be altered AND no input
to a program may influence that program's execution.
- Clearly this is sufficient to prevent viral infection,
  since a virus may not embed itself into any existing
  program AND may not be used as input to an existing
  program in a way that causes that program to behave
  differently.
- Equally clear, however, is the severely decreased utility
  of such a fixed first-order system.

So naturally the question turns to: how to limit the generality
of interpretation in a way that lets systems still be useful.
What kinds of limits can be imposed? What does each such limit
buy us (in terms of protection from viruses)?
- These are open questions.
- We know that in order to infect, a virus needs certain
  operations, including the ability to write
  --> but restricting the ability to write would also make it
      impossible to run most useful programs.

E.g., consider a system that only allowed "display file"
functionality. No file may be modified; a file's contents
may only be shown to the user. (A small sketch of such a
loop appears at the end of these notes.)
- For a database whose contents are static, this may
  result in a usable system
- but not for development environments
- May also work for mail systems, which only need to
  show the contents of mail to the user ... then we
  can prevent infection FROM such (mail) applications
  to other applications via partitioning.

We can write programs for "fixed interpretation schemes"
(such as LISP and Basic) that perform viral infection.
So even though the underlying interpretation scheme is
not infected (since it is static), that scheme may be
used to interpret code that entails viral propagation.

----------------------
(5) Precision problems
----------------------
One could apply isolationism and limited transitivity in
order to constrain viral infection; however, the ability
to share data in a widespread manner is very useful.
Limited transitivity entails tracing exact information
flow ... which is an NP-complete problem. Maintaining
tracing information (markings) requires a lot of space.
So only isolationism is even practical to implement.
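As one concrete reading of the "display file" restriction from section
(4), here is a minimal Python sketch of a loop whose only interpretation
of user input is to read a file and print its contents; the command names
are illustrative assumptions. Because nothing the user supplies is ever
written back or executed, data viewed through this loop cannot alter any
program on the system.

    import sys

    def display(path):
        # Open read-only; the bytes are only ever printed, never written
        # back or handed to another interpreter.
        with open(path, "r", errors="replace") as f:
            sys.stdout.write(f.read())

    def main():
        # The only commands this loop understands: show a file, or quit.
        for line in sys.stdin:
            cmd, _, arg = line.strip().partition(" ")
            if cmd == "display" and arg:
                display(arg)
            elif cmd == "quit":
                break
            else:
                print("only 'display <file>' and 'quit' are understood")

    if __name__ == "__main__":
        main()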