How to Develop Secure Systems: 10 Design Principles by Saltzer and Schroeder

Saltzer and Schroeder's 1975 article "The Protection of Information in Computer Systems", one of the most cited works in the history of computer security, outlines 10 fundamental design principles for developing secure systems, whether hardware or software. Though it was published at a time when only mainframe computers were in use, with no interconnectivity among them, these principles still apply in today's computing world, where personal computers and ubiquitous smart devices communicate with each other over the Internet. At the heart of these principles lie two basic tenets: simplicity and access control. Simplicity favors easy-to-understand designs that result in systems with fewer inconsistencies. Access control, on the other hand, mediates each transaction so that only authorized parties gain access to resources. In this article, we review these principles with key takeaways (bulleted list items are excerpts from the original article) to increase awareness of secure system development.

Principle 1. Economy of Mechanism

This principle favors simplicity over complexity, as systems that are complex in their design and implementation are more likely to contain security vulnerabilities. Embracing and striving for simplicity yields systems that are easier to test, validate, and maintain. For this reason, apply the well-known mantra Keep It Simple, Stupid (KISS) for enhanced security. Key takeaways for this principle are:
Principle 2. Fail-Safe Defaults

In computing systems, the default access right should be "no access". In other words, access rights should be managed individually with "allow" rules (whitelisting), leaving the default at "deny". This is both easier to manage and leaves the system in a secure state if the security mechanism fails. The opposite of this principle, allowing by default and denying in individual cases (blacklisting), is a very dangerous security malpractice and should be avoided. Firewall and file access configurations are two classical examples where this principle should be applied. Key takeaways for this principle are:
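The whitelisting idea above can be sketched in a few lines. This is a minimal illustration, not a real access-control system; the user and resource names are hypothetical. The essential property is that every lookup failure falls through to "deny".

```python
# Fail-safe defaults: permissions are granted explicitly (whitelist);
# anything not listed is denied.
ALLOWED = {
    ("alice", "/reports/q1.pdf"): {"read"},
    ("bob", "/reports/q1.pdf"): {"read", "write"},
}

def is_allowed(user: str, resource: str, action: str) -> bool:
    # Unknown user, unknown resource, or unknown action -> empty set -> deny.
    return action in ALLOWED.get((user, resource), set())

print(is_allowed("alice", "/reports/q1.pdf", "read"))    # True
print(is_allowed("alice", "/reports/q1.pdf", "write"))   # False
print(is_allowed("mallory", "/reports/q1.pdf", "read"))  # False
```

Note that even a bug that corrupts or empties the `ALLOWED` table leaves the system denying access, which is exactly the fail-safe property the principle asks for.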
Principle 3. Complete Mediation

This principle mandates that access rights be fully validated every time an access occurs. A common malpractice that violates this tenet is checking access rights once and relying on that cached result for subsequent access requests. For instance, in some operating systems, file permissions are checked only when a file is opened, and the resulting file handle is used for all subsequent access requests; if the access rules are updated while a user holds the file open, the new rules do not apply to that user. Key takeaways for this principle are:
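The file-handle example above can be sketched as follows. This is an illustrative toy, assuming a hypothetical in-memory rule table `PERMISSIONS`; the point is that the read path consults the rules on every access, so revocation takes effect immediately rather than at the next open.

```python
# Complete mediation: re-check the rule table on every access instead of
# caching the decision made at open time.
PERMISSIONS = {("alice", "secrets.txt"): {"read"}}

class MediatedFile:
    def __init__(self, user: str, name: str, data: str):
        self.user, self.name, self.data = user, name, data

    def read(self) -> str:
        # Validate on each access, not just when the file was opened.
        if "read" not in PERMISSIONS.get((self.user, self.name), set()):
            raise PermissionError(f"{self.user} may not read {self.name}")
        return self.data

f = MediatedFile("alice", "secrets.txt", "top secret")
print(f.read())                                # allowed while the rule holds
PERMISSIONS[("alice", "secrets.txt")] = set()  # revoke while the file is open
# f.read() would now raise PermissionError: revocation applies immediately.
```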
Principle 4. Open Design

This principle reflects the earlier tenets, advising that security should not depend on the secrecy of the design or the implementation. Kerckhoffs' Principle (1883) as well as Shannon's Maxim (1949) suggest sharing the design publicly, increasing the chances that more eyes will detect security flaws, while keeping only the keys secret in cryptographic systems.
The opposite of this principle is known as Security Through Obscurity and should be avoided. Key takeaways for this principle are:
Principle 5. Separation of Privilege

A protection mechanism is more secure and robust if it requires two (or more) separate control mechanisms before granting a privilege or performing a task. Classical examples include the dual keys used for cryptographic key controls or safety deposit boxes. Another technical example: on Debian Linux, a user must both know their own password and be a member of the sudo group (or the wheel group on BSD systems) to be able to execute the sudo command.
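The dual-key idea can be sketched as a "two-person rule" in code. This is a minimal illustration under assumed role names (`finance_officer`, `security_officer`) and a hypothetical operation; a real system would verify cryptographic approvals, not strings.

```python
# Separation of privilege: a sensitive operation requires approvals from
# two distinct authorizers before it runs.
def release_payment(amount: float, approvals: set) -> str:
    required = {"finance_officer", "security_officer"}
    # Both independent controls must be satisfied; one alone is not enough.
    if not required.issubset(approvals):
        missing = sorted(required - approvals)
        raise PermissionError(f"missing approvals: {missing}")
    return f"payment of {amount} released"

print(release_payment(500.0, {"finance_officer", "security_officer"}))
# release_payment(500.0, {"finance_officer"}) would raise PermissionError.
```

The security benefit is that a single compromised authorizer (or key) is no longer sufficient to perform the sensitive action.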
Principle 6. Least Privilege

This principle states that every program and user should operate with as few privileges as possible. In other terms, subjects should be given only those privileges necessary to complete their tasks, and no more. Additional rights should be granted as needed and removed after use. As an underlying tenet, privileges should be based on the need-to-know principle. Key takeaways for this principle are:
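The "granted as needed and removed after use" pattern can be sketched with a scoped grant. This is a toy model over a hypothetical in-memory grant table; in practice the same shape appears as temporary cloud-IAM role assumptions or dropping OS privileges after startup.

```python
# Least privilege: a right is held only for the duration of the task
# that needs it, and is revoked automatically afterwards.
from contextlib import contextmanager

GRANTS = {"batch_job": set()}

@contextmanager
def temporary_privilege(subject: str, right: str):
    GRANTS[subject].add(right)        # grant only what the task needs
    try:
        yield
    finally:
        GRANTS[subject].discard(right)  # remove after use, even on error

with temporary_privilege("batch_job", "write:/var/reports"):
    assert "write:/var/reports" in GRANTS["batch_job"]   # held during the task
assert "write:/var/reports" not in GRANTS["batch_job"]   # revoked afterwards
```

Using `finally` guarantees revocation even if the task raises an exception, so a failure cannot leave the subject holding extra rights.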
Principle 7. Least Common Mechanism

This principle suggests not sharing system mechanisms among users or programs except when absolutely necessary, because shared mechanisms can lead to unintended and uncontrolled information flows among different parties. Moreover, malicious actors can gain unauthorized access and exfiltrate information through these shared mechanisms, which are then known as covert channels. Key takeaways for this principle are:
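A simple sketch of the unintended information flow: if two users share one cache, either user can observe what the other has looked up. The users and keys are hypothetical; the same effect underlies real timing-based covert channels in shared caches.

```python
# A shared mechanism (one cache for all users) leaks information: any
# user can observe entries created by another user's queries.
shared_cache = {}

def lookup_shared(user: str, key: str) -> str:
    if key not in shared_cache:
        shared_cache[key] = f"value-of-{key}"
    return shared_cache[key]

lookup_shared("alice", "salary-report")
# Bob can now infer from shared state that someone queried this key:
print("salary-report" in shared_cache)  # True: an unintended information flow

# Least common mechanism: give each user a private cache that shares nothing.
private_caches = {"alice": {}, "bob": {}}
```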
Principle 8. Psychological Acceptability

Originally, this principle stated that security mechanisms should not add to the difficulty of accessing a resource, so that they will be adopted naturally and exercised correctly by users. Later, this principle was renamed the "Principle of Least Astonishment" to reflect the fact that security mechanisms inevitably add some difficulty, which should be kept as small as possible for increased usability (C. Kaufman, R. Perlman, and M. Speciner, Network Security, 2nd ed.).
This principle also states that security mechanisms must match users' mental models so that they can specify and use protection mechanisms correctly. Stated differently, security mechanisms should be designed so that users can understand why the mechanisms work the way they do. Key takeaways for this principle are:
Principle 9. Work Factor

The resources required to compromise a system using brute-force or trial-and-error attacks (the work factor) can be used as an indicator of the system's security. However, Saltzer and Schroeder note that this principle applies only imperfectly to computer systems, since attackers can use indirect mechanisms, such as system failures or other logical weaknesses, to compromise a system's security. Key takeaways for this principle are:
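A back-of-the-envelope work-factor calculation makes the idea concrete. The guess rate below is an assumed figure for illustration, not a measured one; the arithmetic simply divides half the keyspace by the guessing speed.

```python
# Work factor: expected years to brute-force a random password, given an
# alphabet size, a length, and an assumed guess rate (hypothetical).
def brute_force_years(alphabet: int, length: int,
                      guesses_per_second: float = 1e10) -> float:
    keyspace = alphabet ** length
    seconds = (keyspace / 2) / guesses_per_second  # expect to search half
    return seconds / (365 * 24 * 3600)

# 8 lowercase letters fall in well under a year at this rate;
# 12 mixed-case letters and digits take thousands of years.
print(brute_force_years(26, 8))
print(brute_force_years(62, 12))
```

Note that this number measures only resistance to direct search, which is exactly why Saltzer and Schroeder call the principle imperfect: a logic flaw can bypass the entire keyspace.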
Principle 10. Compromise Recording

Another principle that, as Saltzer and Schroeder note, applies only imperfectly to computer systems is the practice of keeping records of attacks. Though compromise logs should be kept as a best practice, this approach is imperfect for two reasons. First, compromise detection cannot be guaranteed. Second, attackers can change or tamper with the compromise logs. The key takeaways for this principle are:
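The second weakness, log tampering, can be mitigated with a hash-chained log, sketched below. Each entry commits to the previous one, so rewriting any record breaks verification. This is an illustrative toy (real systems also sign entries or ship them to a separate host), and it does nothing for the first weakness, undetected compromises.

```python
# Tamper-evident compromise recording: each entry stores a SHA-256 hash
# over the previous entry's hash plus its own message.
import hashlib

def append(log: list, message: str) -> None:
    prev = log[-1][1] if log else "0" * 64
    digest = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append((message, digest))

def verify(log: list) -> bool:
    prev = "0" * 64
    for message, digest in log:
        if hashlib.sha256((prev + message).encode()).hexdigest() != digest:
            return False  # chain broken: some entry was altered
        prev = digest
    return True

log = []
append(log, "login failure for root from 10.0.0.5")
append(log, "firewall rule modified")
print(verify(log))                        # True: chain intact
log[0] = ("nothing happened", log[0][1])  # attacker rewrites an entry
print(verify(log))                        # False: tampering is detected
```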
Though it is one of the most cited works in computer security, this article is also one of the least read. If you prefer, you can read the original paper: "The Protection of Information in Computer Systems".