The Case for Regulating Quality within Computer Security Applications

Roksana Moore [1]

Cite as: Moore R., "The Case for Regulating Quality within Computer Security Applications", in European Journal of Law and Technology, Vol. 4, No. 3, 2013.


Computer security applications (CSAs) are essential for ensuring information security across insecure mediums such as the Internet. However, despite the reliance placed upon them, empirical evidence and case studies indicate that they suffer quality concerns similar to those of the broader software industry. This paper identifies two key reasons for this. The first is that private law and compensation are unable, in their current form, to raise quality within CSAs. The second is that the public good characteristics of CSAs, and the negative network externalities that defective CSAs exhibit, are addressable only through regulation. There are two types of defect: those that are known to the CSA vendor and those that are not. This paper therefore proposes a two-stage approach towards addressing these. The first stage is to raise the benchmark of CSA quality by mandating the use of standardised software engineering methodologies through the European standardisation framework, thereby ensuring that CSAs are released to the market without known defects. The second stage is to mandate the disclosure of exploitable defects identified after software release by leveraging the proposed European Network and Information Security Directive.

1. Introduction

Information security is the safeguarding of the confidentiality, integrity, and availability of information. As societies become increasingly information driven, so too does their reliance on technology to secure that information. The Internet has rapidly become established as both a storage and transport medium for growing volumes of information, and is now widely considered one of the greatest technological 'inventions' of modern history. Today the Internet is a fundamental economic pillar enabling a wide variety of new business models worldwide. Unfortunately it has evolved around interoperability and resiliency rather than security, due largely to its organic development and the rapid evolution of the commercial, industrial and social demands placed upon it. The result is that the Internet is one of the biggest threats to information security, [2] both because of its design and because poor software quality creates exploitable vulnerabilities and defects within the software powering and using it. [3] In the absence of immediate plans to address this by changing the open nature of the Internet, [4] and with poor software quality entrenched within the software industry, the use of computer security applications (CSAs) to provide security over insecure environments has grown. [5] This reliance has now extended into, and is beginning to be recognised within, legislative requirements - for instance data protection. [6] Although both network and software based security is used on the Internet, the scope of this paper is limited to the latter - i.e. software defects and vulnerabilities. This is because network based security is generally provided in hardware, often as a service by Internet Service Providers. Whilst software does comprise an element of it, substantially different legal issues arise with regard to defects, and it must therefore be addressed separately.

To operate, CSAs require high privilege levels within computing environments, making them high value targets for potential attackers. Despite the reliance placed upon them, CSAs remain subject to similar (or more significant) defects and vulnerabilities as other software types. [7] There are two main categories of defect: those that are known to the software vendor prior to the software release (patent), and those that were unknown but are later identified (latent). [8] Both jeopardise the ability of CSAs to adequately maintain information security. [9] The risks associated with this grow with increased Internet - and thereby CSA - usage. [10] Increased reliance on technology is also a problem, for example through advancements in computer science that close the gap between computing and humans, [11] with human-computer dependencies now emerging. [12] A future whereby computers are contained within human anatomies is therefore foreseeable, [13] with the Internet potentially the communication medium of choice due to its resiliency and availability. Ensuring the quality of these systems and devices is paramount, as is the quality of the software used to secure them. Increasing quality within CSAs (and software in general) must therefore begin now.

Software vendors also do not report the extent of CSA quality related incidents - an issue addressed in this paper, as such disclosures are currently voluntary. Still, numerous incidents have been publicly identified. For example, in November 2012 several vulnerabilities within a leading antivirus product were publicised after the software vendor had been given sufficient time to address them beforehand to reduce the risk of exploitation. [14] This incident also highlighted a delay of several weeks between the software vendor receiving notification of the vulnerabilities and the issuance of an update to fix them. During this period users continued to buy and use the software. Whilst publicising vulnerabilities gives users the opportunity to manage their risks, it also informs attackers and thereby increases the risk of exploitation. A further issue of CSA quality was highlighted in February 2012, when elements of code stolen from a CSA software vendor were made public after attempts to extort money failed. Following publication it surfaced that elements of the code, dating back to 2006, remained in use within the vendor's current CSA offerings. Vulnerabilities could therefore be identified and exploited - prompting a recommendation from the vendor to disable the affected software components. [15] These are just two examples of publicised vulnerabilities; many more are addressed without user awareness.

Defining CSAs is a critical component of this paper, with security best delivered wherever processing takes place - for example where code is executed, data is accessed and modified, and users interact. The CSA ecosystem has typically focused on these areas using obvious candidates such as anti-malware, intrusion prevention and firewall software. The rising popularity of Software as a Service (SaaS), and the proliferation of mobile devices, has added new points of transaction. This has caused the CSA ecosystem to become increasingly complex and fragmented, in particular as 'CSA functionality' is wrapped into services provided by 'non-CSA vendors'. [16] Regulatory scope must therefore extend beyond traditional CSA vendors, making the definition of CSAs critical. On that basis CSAs are defined as: generally available software that does not provide direct productivity or entertainment value, but which provides a service or function intended solely to uphold some or all of the general principles of information security - irrespective of whether the CSA is standalone or incorporated within other software such as an operating system. The key to the definition is that CSAs are not just traditional 'security only' products, but include any software that encompasses significant CSA functionality and at least partially supports information security. This definition would therefore extend the regulation to SaaS providers and hardware manufacturers that integrate CSA functionality into their software, or the software accompanying their hardware, despite security not being their primary business function.

The argument that poor software quality is endemic across the entire software industry has already been established in a parallel publication. [17] It was demonstrated that this derives from three fundamental causes: the failure of the software industry to self-regulate quality, the presence of information asymmetry prohibiting users from establishing software quality, and the absence of economic deterrence resulting from restrictions on users' abilities to exercise their consumer rights. One solution to software quality, demonstrated through empirical research, was put forward: the methodologies applied during the software engineering process can substantially affect software quality, and the impact on quality varies substantially between methodologies. The role of standardisation in raising software quality was also proposed. One use was the standardisation of software engineering methodologies; the second was to use those standards as a benchmark against which software vendors could be measured when issues of liability for 'substandard software quality' arise. This has advantages over attempts to address quality issues through legislation - an approach that has failed to date.

Despite the benefits of standardisation when establishing liability for software quality, a reliance on private law remains. The timeline for improving software quality is therefore largely dictated by the law and by market reaction to instances of liability. Given that neither of these can be predicted, the success of this self-regulated approach to the use of standardisation is equally uncertain. Whilst supporting self-regulation of the software industry is a good starting point for most software types, this paper has already argued that CSAs are critical for maintaining information security. On that basis CSA quality must be addressed differently to other software types.

This paper recommends a direct regulatory approach to CSA quality with a deterministic output, with two main points supporting this approach. The first is identification of the negative externalities and public good characteristics of information security that prevent quality from being effectively addressed outside of regulation. The second (as noted) is the inability of private law to achieve the required goals within an acceptable timeframe. Whilst legal review has been conducted into this area, [18] it has focused broadly on the information communication technology (ICT) industry and the legal certainty required from private law. In contrast this paper is primarily concerned with CSAs and direct regulatory initiatives. Given the borderless nature of the Internet and software, and the national (or state) centric implementation of laws and regulation, there are substantial challenges for consistency and meaningful enforcement. On that basis the proposals within this paper are centred on the European Union.

Finally it is important to differentiate between liability for poor software quality and liability (or responsibility) for security per se. This paper argues for the former only. If responsibility for security is to remain with users employing industry provided tools, [19] then the industry must take responsibility for those tools - ergo, guaranteed quality assurance for CSAs.

2. The Trappings of Private Law And Compensation

Within a broader information security context CSAs are one of the few components that have largely been left to private law to manage. Most others have been addressed proactively through legislation. [20] Whilst private law and compensation are capable of raising quality through economic deterrence, [21] the software industry enjoys limited liability for defects and thus the effectiveness of private law and compensation is significantly reduced. The costs of addressing quality concerns are also not a barrier. This is because the somewhat unique business models of the software industry provide the ability to modify software post release. This avoids the costly recalls encountered in most other industries, with a principal reason for this deriving from network economics, as discussed in Section 3. [22]

Regulation through private law works on the assumption that software vendors are held liable for defects and that adequate compensation for loss or damage is exacted. If this held true, then where the cost of defects exceeded the gains of releasing poor quality software, either an upward shift in quality would result or the costs would be passed on to users through increased licensing costs. Unfortunately, as noted in the previous section, two points must be made: (a) civil liability does not work in the law's current form, and (b) even if it did, a cultural shift towards litigating is required in order to force the computer security industry to adopt a culture of quality.

2.1 Quality Assurance through Private Law

The ambiguities surrounding software liability have been widely discussed within academic literature, [23] with software licensing and classification arguably the primary reasons why private law struggles to deter poor software quality through suits. Furthermore, previous research highlights that even if software licensing issues were satisfactorily addressed, the intangible and technical nature of software creates further barriers. [24] For instance, the classification of software as a good or a service is necessary to determine the applicable legislation. [25] Currently the whole concept surrounding licensing - known in the industry as End User License Agreements (EULAs) - has been and remains problematic. This is because EULAs meet the criteria of contracts under narrow circumstances only, [26] and because they seek to absolve the software vendor of all software related liability. Unfortunately even surmounting these challenges does not solve the problem, with several other barriers and restrictions remaining. These include: (a) the burden of proof being placed on the user to prove that the software is defective and that it satisfies the legislative criteria for liability; (b) the legal principle of remoteness, which would make many losses unrecoverable - for instance lost profits; [27] and (c) the inability to recover intangibles such as loss of privacy. The technical nature of software also presents several challenges. For example, software defects do not always result in failures visible to the user, [28] and when they do, the failures might manifest outside of the software in which the defects originate. [29]

Despite awareness of the need for software liability dating back to the 1980s, [30] there remain prominent advocates of software liability even in light of the known technical and legal challenges. [31] Little has materialised in the form of regulation, [32] and current legislative reviews such as the Consumer Rights Bill fail to take the opportunity to address the issue sufficiently. [33] That does not imply, however, that users are without remedy. Although there is currently no precedent, consumers can theoretically claim damages, and the proposed Consumer Rights Bill will ensure that consumers can seek redress if digital content causes damage or loss to the consumer's device or digital content. [34] Whilst this should ideally be sufficient to produce the regulatory effects of civil liability, it is not. This is because: (a) damages are an inadequate remedy for security breaches, because much of the loss is intangible or too remote to be recovered (e.g. loss of privacy, reputational damage, and even business failure); (b) consumers have no effective redress except via class actions, which are only worthwhile for serious breaches; and (c) under civil liability CSA vendors are in effect insurers of the billions of computer/Internet users worldwide, so in the event of a major failure they would be unable to afford to pay damages (i.e. the risk of liability is no incentive to take further care, because if it materialises the CSA vendor is inevitably insolvent). Moreover, additional actions in the event of security failure (particularly notifying end users at risk) are necessary but not provided for under civil liability. Civil liability is thus a fragile and reactive regulatory tool.

Private law is an essential component of self-responsibility and individualism; however, it is reliant on the empowerment of freedom of contract. The inability to correctly leverage private law renders self-regulation in this sense non-existent, a problem further compounded by the absence of case law and penalties from regulators. [35] Even with adequate consumer protection and means of redress, the case for government regulation of CSA quality remains. Self-regulation of CSA quality would require constant evolution to challenge the industry, a task further complicated as the industry continues to develop technically and introduce new business models. The premise of this paper is that ensuring CSA quality through more direct forms of regulation can dissolve this paradox.

2.2 Civil Liability

The previous subsection identified the shortcomings of private law in deterring poor software quality within the computer security industry. This subsection explores the impact this has had on suits in the UK in comparison to the US.

The UK has a notable absence of suits related to software quality. Whilst there are no consumer cases, the limited number of B2B cases have largely sought to establish what a defect was, [36] or whether the software was fit for purpose. [37] Compared to the UK, the US courts have been much more willing to challenge software liability, as their legal culture enables policy issues to be challenged in court more easily than in the UK. The US is significantly more suit driven. This can be seen when reviewing the Principles of the Law of Software Contracts (the Principles), a guide from the American Law Institute for the courts when dealing with software related issues, [38] which demonstrates the willingness of the US courts to apply legal provisions flexibly. [39]

UK case law has also failed to set industry expectations, with the courts often avoiding the technicalities of software by establishing only that a breach of contract took place. [40] This absence of a culture of suits results in sporadic and fragmented case law. Whilst a growth in software liability cases is expected in the future, under current legal parameters there is insufficient certainty and reassurance for litigation to change the attitudes of software vendors with regard to quality. Instead it is more likely that software vendors will accept liability as a cost of doing business rather than as an incentive to invest in quality assurance.

Analysis beyond the software industry shows that compensation and active suits often struggle to form sufficient deterrents within a self-regulatory context. One prominent example of this can be seen within the media industry. This in part led Lord Justice Leveson to recommend that exemplary (punitive) damages should be available as an effective tool in addition to compensation and liability for actions of breach of privacy, confidence and other similar media torts. [41]

Whilst the UK lacks a culture of applying exemplary damages in matters of private law when compensation seems inadequate, the US does not; US courts have in general sought to achieve deterrence using exemplary damages. Examples include the awarding of compensation and punitive damages when a defendant failed to take action over a knowingly dangerous product, [42] the awarding of punitive damages for an accident that a manufacturer could have avoided through design changes but chose to defer, [43] and for the disregard of safety following a failure to conduct adequate testing. [44] The US uses punitive damages to go beyond basic remedy, instead signalling to the market that the identified misconduct is deemed gross and unacceptable. The question remains whether punitive damages are more effective than regulation or other appropriate methods for regulating goods upon which there is significant public reliance. With reference to the earlier point that CSA vendors are in effect insurers of the billions of computer/Internet users worldwide, any major incident would bankrupt the vendor rather than deter further bad practice, thereby undermining how effective this approach might be in a CSA context.

Given the substantial differences in legal culture between the UK and US on this topic, a useful insight into US legal culture is provided by Kagan. With consideration to the drawbacks of a compensation culture, Kagan describes adversarial legalism as the nature, culture and unique identity of the American legal system, one that highlights the burdens and concerns of a 'liability society'. [45] Whilst there is no definitive definition of the term, adversarial legalism describes methods of policy implementation and dispute resolution that Kagan believes make American law suit driven, punitive, political and inconsistent. [46] It implies that the US is more likely to address significant practical and policy issues through the courts rather than government, which Kagan contrasts with European social liberalism. [47] The American system of adversarial legalism is an aspect of classic free market liberalism, the alternative being not the absence of law, but a centralised 'bureaucratic legalism'. [48] As such there are demonstrable differences between the regulatory and general legal approaches of the UK and US. The UK has an interest in maintaining a statist welfare system, typically employing detailed, judicially enforceable legal regulations and procedures. The US, on the other hand, is far more individualistic. Requiring greater flexibility to empower individualism, [49] it is more likely to rely on markets than to systematically alter the outcome of market forces.

Zedner observes, with regard to security, huge social differences between European and US approaches, and whilst her observation centres on physical security, many of the social differences extend to computer security also. The US attitude to regulating security is also very individualistic, with Zedner noting that poor US social welfare and economic conditions often find security expressed through mass gun ownership. [50] Since information security per se embodies significant public interest, the lack of government involvement in the US reduces public representation at the cost of shifting the balance towards the market, which understandably places its own commercial interests first. Whilst this approach gives the US a competitive market edge, it places it in a weaker position than Europe within some policy areas - for instance consumer protection, electronic contracting and data protection. [51] Historically this has caused some setbacks. For example the Platform for Privacy Preferences (P3P), [52] introduced by the W3C in an attempt to capitalise upon its market strength, was deemed unsuitable for achieving its aim due to a lack of consumer confidence deriving from the absence of sufficient consumer protection policies supporting the platform. [53] Information security and privacy represent an area of continual importance to Europe and provide a strong example of how it is supporting not only the Single Market, but also the Single 'Electronic' Market. [54]

The dangers of private law regulation should not be overlooked. When relying on liability to regulate, the market has the option not only to challenge the law in court, but to 'gatekeep' to prevent precedent being set. Evidence of this was highlighted by Kuwahara, who noted that the biggest risk to large technology multinationals - in this instance Microsoft - is not exposure to liability, but the drop in share price if held liable. Hence 'Microsoft will settle any security related case prior to trial because Microsoft's board prefers to avoid the possibility of a precedent setting adverse final judgment.' [55] Adversarial legalism empowers the market and individuals, and in the case of software liability it has come at the detriment of consumer welfare.

Whilst not a direct CSA example, evidence of US regulation in operation can be seen in the FTC lawsuit against HTC America. [56] The FTC imposed heavy restrictions following failures by HTC America to produce software for its mobile devices with sufficient consideration to security. These failures included inadequate security training for engineers, insufficient security related testing, a failure to adopt appropriate software engineering methodologies, and the absence of processes to manage third party vulnerability reports. As the FTC does not have the power to fine, the settlement requires HTC America to patch known security vulnerabilities, introduce security risk assessment during development, and undergo independent security auditing. Whilst this case will undoubtedly change the attitude and culture of HTC towards security, it is an ex post approach reliant upon adversarial legalism to regulate. Moreover it is the premise of this paper that leaving the market to self-regulate, and relying on suits when that self-regulation is deemed inadequate, is neither the best way of supporting the market nor of protecting security. As further explored below, there are many other ways in which policy and regulation can be pursued; a further premise of this paper is that a proactive, collaborative approach to security regulation facilitates vendor involvement in developing a quality culture, and is more likely to lead to compliance in a manner that supports the market and protects security accordingly.

The US approach to regulating technology is not compatible with that of the UK or Europe. However, should the US prove successful in regulating through adversarial legalism, that approach will ultimately be exported globally along with the 'output' of the regulatory process. Given the maturity of software and software liability in the US, seeking indirect regulation through adversarial legalism has thus far proven ineffective for US consumers.

3. Economic Justifications for Regulation

The prevalence of vulnerabilities and defects within the output of the software industry is an artefact of market failure - failure that results in technical and economic failings, and which is a common driver behind regulation. [57] The technical aspects of market failure have been broadly discussed in a parallel publication referenced throughout this text and will not be repeated here. [58] The economic failings have also been largely discussed within academic literature, but given their relevance to this section they will be highlighted. The most common economic failings relate to the intangible nature of software, its near-zero marginal costs, information asymmetry preventing users from ascertaining software quality, and the ability for software defects to be addressed post release - facilitating the 'ship first, patch later' mentality that the software industry enjoys. [59] This latter point is often deemed most responsible for poor software quality. Whilst the wilful ignorance of defects may take place in some instances, it is more likely that software vendors are unaware of their presence until after the software is released onto the market. [60]

Rather than expanding upon these economic arguments, this section seeks to draw attention to two additional ones from which further support for the mandatory regulation of CSA software quality can be derived. The first is the public good characteristics of information security and CSAs, and the second is the negative network externalities that they exhibit. Within an economic context public goods are classically defined as those whose consumption does not result in there being less of that good available for others (i.e. non-rival), and whose consumption is non-excludable; [61] with military and policing being two classic examples. [62] Network externalities are instances whereby actions have economic consequences for others for which no compensation is available. [63] They are therefore the economically unrecoverable and uninvited impacts, whether positive or negative, that an activity has on uninvolved parties.

3.1 Private Security and Public Goods

The relationship between information security and public goods is unclear and somewhat debatable. Hence any discussion concerning the economic justifications for regulating CSA quality must consider whether CSAs should be left to the market (i.e. to private actors) or whether they should be publicly controlled and managed. If information security, and therefore CSAs, is deemed a public good, then the traditional notion is that public goods - and therefore CSAs - must be produced through public means.

There are two principal views on this topic. The first is expressed by Loader and Walker, who comment that 'security is a valuable public good, a constitutive ingredient of the good society, and that the democratic state has a necessary and virtuous role to play in the production of that good'. [64] Whilst supporting the argument that security per se is a public good, this traditional view has also been challenged. [65] Alun notes that "our sense of 'publicness' is contingent on the market or technology not being able to provide 'law and order' as a private good". [66] Private provision could therefore replace public provision, or exist in harmony with it, [67] with private security not necessarily replacing public security but instead existing in parallel, given that it addresses areas largely beyond government in both a technical and an economic sense. [68] A shift in efficiency has thus taken place, [69] and given that public goods are typically under-produced in competitive markets where their producer cannot recapture production costs, it is important to establish whether information security (and hence CSAs) is considered a public good. The premise therefore is not that industry should not help produce security, but rather that it should not be left alone to do so.

One approach to answering this question is to draw parallels to other forms of security - for instance national security and policing. Private security generally exists to preserve property rights rather than enforce criminal law, [70] particularly within areas of high public interest. [71] Examples that are commonplace globally include private military, private security guards, and private security companies. Many countries have regulated this private security sector. For example, the UK Security Industry Authority, [72] an independent organisation responsible for regulating the UK private security industry, reports to the government under the Private Security Industry Act 2001. [73] Hence the private sector already plays a role within some spheres of security. A more academic analysis is therefore to establish whether any public good characteristics exist within CSAs. If present, these might partially explain why standards and quality in software engineering are under-produced.

Determining the presence of public good characteristics within CSAs is unlikely to be absolute. Varian, in discussing the reliability of computer systems and security, highlights the dependency of reliability upon multiple rather than singular effort, [74] and it is from this 'multiple effort' that the public good characteristics of reliability derive. [75] Information security is based upon what Varian describes as total effort. This is the sum effort of all individuals, and can be likened to the concept of flood defences as described by Hirshleifer. [76] Information security, whilst not widely considered a public good per se, has similar characteristics with regard to reliability. For instance, information generally passes through a chain of systems in which (akin to flood defences) the weakest link dictates the overall level of security available. Hence, like reliability, information security is reliant on total effort in order to be successful. Public good characteristics can thus be derived, as supported by Camp and Wolfram, who highlight how the absence of security on one computer can adversely impact others. [77] Anderson adds further support, noting that security (in his words, cybersecurity) is the responsibility of all Internet users, given that the level of security on one computer influences the potential for it to be used to attack others. [78] As a derivative of the broader classification of security, and because it is not individualistic in nature, public good characteristics within information security can be found.

Having established public good characteristics within CSAs, the impact on quality can be understood economically. Powell, in providing further support to the public good arguments, notes that if the costs of security are high, the private benefits low, and the public benefits high, then firms will under-provide security on the market; whereas if the costs are low and private benefits high, then firms will generally provide close to efficient levels of security despite some positive externalities. [79] Given that CSA vendors, and software vendors in general, are not liable for the costs of defects, quality assurance is generally under-produced. This is particularly true within business to consumer (B2C) instances. Certainly there is also the risk that consumers themselves will under-provide security by focusing largely on private benefits. [80] The privatisation of security muddies traditional concepts of "publicness". One of the main challenges in dealing with security, software, and the Internet is the fragmentation between public and private spheres, but in this case that fragmentation provides a compelling justification for regulation to ensure quality. [81] Nevertheless, the arguments surrounding CSAs and information security as a public good are only one reason in support of regulation.

If information security is a public matter, then by the associations already established so too are CSAs. Whilst mandating the use of CSAs is a separate topic that is outside the scope of this paper, [82] when CSAs are relied upon for protection, safeguarding that reliance and trust becomes important. Trust comprises many factors. Whilst also beyond the scope of this paper, what is often considered trust is actually reliance, particularly when circumstances force that reliance, as with CSAs. Trust is therefore an important and generic justification for regulation per se. [83] Fukuyama argues that trust is the key social and economic capital, with the desired effect of any regulation being to correct market failure and build trust. [84] Although law and regulation can play their part in building trust by creating transparency, choosing the regulatory tool to do so is not easy.

3.2 Network Externalities

Further to the public good argument for regulating CSA quality is that of network externalities. This argument rests on the grounds that CSA vendors have little incentive to raise quality themselves: poor quality generates negative network externalities, the costs of which fall largely on third parties and thus have little impact on the CSA vendor itself. This subsection outlines the reasoning behind this assertion.

Network externalities are the unrecoverable consequences of another's actions, and can be both negative and positive. Common examples of negative externalities can be found within environmental pollution: pollution produced by a new factory must be borne by those within its vicinity, and the construction of a new runway generates additional noise for local residents. In contrast, positive network externalities include the telephone, the fax machine, the coordinated use of standards, and the Internet, all of which exhibit increased value for existing subscribers or users of the technology when it is adopted by another - a concept typically referred to as a positive network effect.

CSA usage creates positive network externalities for all Internet users. This is achieved by directly protecting the user and indirectly raising security for other Internet users, for instance by substantially reducing the potential for the user's computer to be exploited and thereby pose a risk to others. The absence of security has the opposite effect by increasing risks to others - an assertion described by Camp and Wolfram, [85] who cite three generic ways in which computer exploitation can create such negative externalities: (a) when compromised computers are trusted by others, (b) when multiple compromised computers are used to 'attack' others, or (c) when a compromised computer is used to mask the origin of further downstream malicious activities. CSAs can inhibit, directly or otherwise, all of these methods. Despite the correlation between CSAs and security, defective CSAs not only provide the opposite, but also generate greater negative network effects the more widely they are used. [86] This is particularly important given the role of CSAs in maintaining information security. From a regulatory perspective the analysis of network externalities in the context of CSAs is important. This is because - like public goods - positive externalities are usually under-produced within unregulated markets, as their benefits are not realised by those creating them; negative externalities are generally over-produced because the costs deriving from them are usually carried not by those responsible for them but by those affected by them. [87] As Camp and Wolfram note, this will remain so 'absent government intervention or other solutions to internalize the externalities'. [88]

A further pollution-derived example supports these arguments. Jaffe et al, in addressing environmental pollution, highlight how polluters generally reap benefits whilst imposing "costs" onto others, and thus have little incentive to remedy the situation. [89] Contrasting this with CSA vendors provides supporting arguments as to why the negative externalities of the software industry have not been addressed thus far. For technology Jaffe et al cite the opposite as true, claiming that technology investors usually incur costs despite creating benefits for others. Whilst CSAs are clearly beneficial in protecting information security, any benefits are dependent on software quality. Defects unknown to the CSA vendor are therefore likely to generate negative network externalities. If the defects are known, and the user still opts to use the software, it can still be argued that the defects damage something with public good characteristics - as identified above. In the absence of sufficient incentives, government regulation could help address negative externalities. This is no different to factories "internalising" labour and material costs in order to manage them, whilst insufficient incentives to control negative externalities (for instance factory pollution) require environmental policy to address them. [90]

CSA vendors are not devoid of incentives to address quality; defective products can, for instance, have commercial and reputational impacts. However the overall effect of these incentives is likely to be minimal given that the problem is endemic throughout the entire industry. Hence, in the absence of regulation, defective CSAs will flood the market due to the negative externalities noted and the limited threat of civil liability. On that basis arguments in favour of government regulation exist - a trend seen elsewhere within breach notification and information security policies for instance.

The economic arguments in favour of regulating CSA quality are compelling, given that the market should not be left alone to provide security, and because regulation is the only way of addressing negative network externalities. The following section therefore sets out the regulatory proposal of this paper.

4. Regulatory Governance

The regulatory objective of this paper is to raise CSA quality. It has been argued that within the current self-regulatory model this must be achieved either through civil liability or market initiative. Unfortunately, for the reasons outlined, neither is likely to be successful, with two sets of reasoning having been provided for this. The first was that civil liability should not be relied upon for anything other than compensation, as the outcome of litigation, and the timescales involved, are both unpredictable; moreover, the UK does not have the legal culture required to utilise private law as an indirect regulatory tool. The second was that the public good characteristics of CSAs mean that quality will always be under-produced through private means, whilst the negative network externalities that defective CSAs exhibit are addressable only through regulation.

Empirical research indicates that software quality can be significantly influenced by the software engineering methodologies used, with the methodologies providing the best quality gains deriving from the market, and formal standards-based methodologies woefully underperforming in comparison. [91] Unfortunately, despite their existence, effective quality-raising methodologies are generally under-utilised by software vendors. [92] The first regulatory requirement is hence to mandate the use of the best performing methodologies. Unfortunately, because these currently derive from the market they are uncontrolled, and it cannot be guaranteed that they will be maintained or keep pace with developments in software engineering and CSA technologies. A greater degree of certainty is therefore required. Addressing the software engineering stage is the basis for ensuring that CSAs are released without known defects. However, software is unlikely ever to be 100% defect free, and previously unknown defects are certain to exist. On that basis the second regulatory requirement is to provide consistency in how software vendors address such defects when they are identified post release. This is particularly important given the information asymmetry surrounding software quality as previously noted. In this instance regulation will primarily be required to ensure that CSA vendors take appropriate steps to remedy defects within a timescale commensurate with the risk that they present. Whilst mandating their public disclosure has been considered, for defects within CSAs this is deemed counterproductive to end users' information security, and hence this approach should be treated with caution; for example it might permit the widespread exploitation of a defect. Instead, end users require regulation to ensure trust and assurance that defects identified post release are addressed in a responsible manner and within appropriate timescales.

In the broader scheme of government and commercial information security strategies, CSAs, as facilitators of information security, have become crucial ingredients. This is particularly true given the strong reliance on technological security to supplement law; i.e. data protection and breach notification legislation. [93] As such there is an extensive domain that depends on CSAs to help facilitate its regulatory obligations, and it thus seems natural that CSAs should be equally regulated. What this section proposes is therefore a regulatory starting point only, and not an end in itself, recognising instead that when coping with rapid technological advancements within a globalised world a linear approach to regulation cannot be taken, and agile revisions will be required.

4.1 A Case For Regulation

Identifying the best regulatory model to achieve the above objectives is a challenging task, in particular given that the introduction of regulation has had demonstrably polar effects in other areas. One example is the positive 'race to the top' or 'California effect', as entities scramble to become the most compliant with regulatory requirements. [94] Another, as seen within the US nuclear industry, is an inadvertent 'halt' in technological advancement where regulators set measurable quality requirements based on the 'best available technology at the time', thereby essentially stalling progression on 'technicalities'. [95] Hence one of the biggest challenges to regulation is ensuring that it does not become a barrier to trade nor negatively impact the market.

When choosing a regulatory model there are a number of factors that should be considered. One example of these is provided through OECD guidelines, which highlight several criteria that should be satisfied. These being that regulation should: [96]

  1. serve clearly identified policy goals, and be effective in achieving those goals;
  2. have a sound legal and empirical basis;
  3. produce benefits that justify costs, considering the distribution of effects across society and taking economic, environmental and social effects into account;
  4. minimise costs and market distortions;
  5. promote innovation through market incentives and goal-based approaches;
  6. be clear, simple, and practical for users;
  7. be consistent with other regulations and policies; and
  8. be compatible as far as possible with competition, trade and investment-facilitating principles at domestic and international levels.

Whilst some of these points are not directly relevant to the regulatory objectives of this paper, most are important. For example the policy goals of regulating CSA quality have been outlined in the introduction, and have been supported through parallel research providing sound legal and empirical support. Economic effect and market impact, perhaps some of the more contentious attributes, have also been discussed - for example the public good characteristics and negative network externalities attributed to CSAs. Even under a cost/benefit analysis, regulation is likely to be a cheaper alternative to information security losses and loss of user confidence. Finally, regulation and policies to address CSA quality must also align with existing information security concerns - a factor featuring prominently within both national UK and regional EU policy agendas. [97]

Prior to outlining the regulatory proposal, it should be reiterated that a principal regulatory problem in this context is where to draw the line when categorising software as a CSA. The concern is that technical uncertainties might arise as to what constitutes a CSA, mirroring the legal uncertainties discussed under private law. Evolutions within the software industry may also defy categorisation, whether intentional or not, in particular as 'security like' features become increasingly prevalent within non-CSA software.

4.2 Regulatory Proposal

When regulated areas become increasingly specialised, traditional approaches towards regulation are challenged. The range of actors involved in the regulatory process also becomes increasingly broad, with collaboration between private and public actors commonplace. This need for collaboration with private actors is particularly true within the software and computer security industries, which have grown organically from private roots, and hence the relevant expertise is for the most part privately held. On that basis the involvement of private actors within any regulatory process proposed by this paper will be necessary to raise CSA quality. Parallel research highlighted this, with privately developed software engineering methodologies identified as being the most effective. [98]

There is a broad range of regulatory approaches and tools, [99] however this paper seeks to utilise existing tools that are already capable of addressing the information asymmetry problems attributed to CSA quality. Whilst the requirement to include private actors might be expected to narrow the scope of regulatory options, in actuality it achieves the opposite. Collaboration between public and private actors makes available a diverse range of models and tools, and is widely seen to transition regulation into what is termed regulatory governance: a recognition of the transnational properties of diverse regulatory requirements operating in a seemingly chaotic ensemble of horizontal and vertical collaborations between actors. Regulatory governance therefore provides a stark contrast to the traditional 'command and control' approach, [100] a shift deriving from the cost and efficiency frustrations, and lack of expertise, of traditional government programs, [101] and reflecting significant evolutions not only in government action, but in the form of that action - i.e. the tools that are used to achieve it. [102]

This paper previously identified two regulatory requirements for raising CSA quality: a) the identification and elimination of patent defects prior to the release of software to the market, and b) the disclosure of latent defects identified post software release. No single regulatory approach has been identified that addresses both of these requirements, and this paper therefore proposes the following two-pronged approach, expanded upon below.

The foundation for satisfying the first requirement was outlined within parallel research, which demonstrated that the choice of software engineering methodology used to create software significantly impacts the quality of that software, [103] and that the use of the most appropriate methodology is capable of significantly raising software quality accordingly. It concluded that there was a clear need for standardisation within this area in order to provide consistency within software quality and to guide the software industry. Unfortunately statistics indicate that existing software engineering methodologies produced by formal standards bodies significantly underperform against the best performing industry-derived alternatives, [104] and furthermore there is no obligation on software vendors to utilise any particular standardised or industry-derived methodology at all. On that basis this paper argues that the first regulatory requirement could utilise the existing EU standardisation framework.

Introduced under Directive 98/34/EC to address technical barriers to trade and dispel economic uncertainty, the EU standardisation framework is traditionally called the New Approach. [105] The intended use of this framework, in this instance, is to mandate the use of appropriate software engineering methodologies for all CSAs sold on the European Single Market. However, instead of imposing legal obligations only, any use of standardisation must be supported by a structure that supports the legal foundation. As discussed below, this structure includes the ongoing development of software engineering methodologies and the mandatory disclosure of defects to overcome information asymmetry concerns. The need for this structure is particularly clear given the strong arguments for the involvement of private actors made within this paper, and the fact that the most effective software engineering methodologies (in terms of quality) derive from the market. Whilst the creation of a single software quality standard might be the simplest approach, the fast pace of the ICT industry introduces challenges in maintaining its relevancy to current software engineering practices. Therefore, for reasons explained below, a more agile approach capable of responding to industry developments is instead proposed.

For the second regulatory requirement this paper recommends use of the legislative framework proposed under the Network and Information Security (NIS) Directive. [106] Intended to mandate the disclosure of information security breaches, it has the aim of establishing a high common level of network and information security across the EU, seeking to ensure trust within the European Single Market and facilitate the operation thereof. This paper argues that the NIS Directive should be expanded to encompass the mandatory disclosure of defects identified within CSAs post release, following standardisation of a responsible disclosure process. However, limits to that disclosure are proposed in order to manage end user risk.

4.3 Standardising Software Engineering Methodologies

As noted above, this paper recommends use of the European standardisation framework as a tool for achieving the first regulatory objective of raising CSA quality. The primary reason for this is the ability to establish a European Norm (EN) for CSA quality, [107] the instrument typically used to implement and harmonise European policy. In this context the EN would be used to mandate CSA quality by requiring that all CSAs sold on the European Single Market adhere to the quality requirements outlined within the EN - irrespective of the place of origin. This is important given that the CSA industry largely exists outside of the EU. However, given the complexities of software engineering, this paper highlights two key regulatory requirements for the EN. The first is that it must require all CSAs to be sold with no known defects, as opposed to being 100% defect free (as this is unattainable), whilst the second is that it must not directly reference specific software engineering methodologies. [108]

The first regulatory requirement of the EN exists because statistics indicate that defects are almost always likely to exist, [109] and hence a requirement for software to be entirely defect free would place vendors under constant threat of breaching regulatory objectives, risking stifling the market. It is instead recommended that the EN prohibits the sale of CSAs with patent (known) defects, and requires that CSA vendors take defined measures to manage the risk of latent (unknown) defects. There are two steps in achieving this, both of which the EN would need to recognise. The first is mandating the use of standardised software engineering methodologies to ensure the necessary levels of quality assurance. The second is to require that CSA vendors can justify their choice of software engineering methodology.

The second regulatory requirement of the EN exists because, although ENs are effective in harmonising policy across the EU, one of the costs of achieving this is that they are slow to develop and implement. This slow pace of EN development is ill-suited to the fast pace of the software industry, which is in a state of constant evolution. The European Commission (EC) has attempted to modernise the EU standardisation framework to address delays and better facilitate ICT standards, [110] as well as to better recognise the role of private actors within the standardisation process, which can further reduce delay. [111] For example, standardisation policy adopted in 2011 aimed to accelerate, simplify and modernise the European standardisation process. [112] The more recent European '2020' strategy also sought to address delays, in particular within ICT. [113] As it currently stands, however, the EU standardisation framework remains insufficiently agile to support a recommendation that the EN should directly recognise or encompass any particular software engineering methodologies; to do so would undoubtedly create a barrier to innovation and quality. The drafting and content of the EN is therefore paramount.

For that reason this paper recommends that the EN is used in conjunction with the New Deliverables (ND) framework. [114] The ND is a series of normative and informative documents that complement the fully harmonised EN, and which can be published within greatly reduced timescales. Intended to introduce into the European system selected benefits of the market-led model, these document types include the Technical Specification (TS). A TS is designed to introduce consensus and transparency into Europe when no immediate need for an EN exists. Whilst a TS can be transformed into an EN, it is most suitable for providing guidance to the market, providing specifications on evolving technologies, or supporting the development of the European market when an EN is not required or is unfeasible. On that basis this paper identifies the TS as one way of providing guidance to the software industry on the software engineering methodologies to use, the use of which is capable of recognition by the proposed EN accordingly. This paper therefore recommends a model whereby the best performing software engineering methodologies are formalised as TSs under the ND framework, and an EN for software quality is established that mandates that CSAs are developed in accordance with one or more of these TSs.

The use of an EN also provides a further option: the optional use of CE marking. [115] Whilst an overview of CE marks is beyond the scope of this paper, they serve as an attestation that the product to which they are affixed complies with appropriate legislation. [116] The responsibility for ensuring this compliance resides with the manufacturer, and in this context the presence of the mark could serve as an indication of quality to end users. Instances can arise whereby a vendor incorrectly or falsely attests conformity, in which case the penalties for misuse vary depending upon the severity of the incident and the civil and criminal laws within the relevant Member State. Noncompliant products that do not pose an immediate safety risk are usually allowed to remain on the market whilst the manufacturer is given an opportunity to address the noncompliance; continued failure to comply requires the product to be withdrawn from the market. Unsafe products must be removed immediately; recall may also be a legislative requirement. Breach notification law could also complement this, as discussed in the following section. This approach would avoid the legal challenges identified under private law, on the basis that any assessment would be intended to establish whether the software vendor correctly adhered to the TS/EN, and thereby whether the software was released with known defects. This means that a technical assessment is required rather than an attempt to map existing laws onto software.

This highlights a final set of requirements: how to determine which software engineering methodologies should become a TS, how to ensure that existing TSs remain capable of achieving the required levels of quality assurance, and how to keep CSA vendors informed accordingly. To meet these requirements it is proposed that the European Network and Information Security Agency (ENISA) be utilised. [117] ENISA operates as a central body of expertise within Europe on information security, providing expertise and advice to the EC and Member States accordingly. Its responsibilities include: [118]

  • Collecting and analysing information on risks to information security
  • Enhancing co-operation between public and private bodies within network and information security
  • Facilitating cooperation between the Commission and Member States on developing methodologies in relation to network and information security issues
  • Raising awareness of network and information security issues, promoting best practices, and seeking public/private sector synergies
  • Assisting the Commission and Member States in industry communication concerning hardware and software security-related issues
  • Tracking the development of network and information security standards
  • Advising on research into network and information security and risk prevention technologies
  • Promoting risk assessment and management solutions
  • Contributing to international collaboration and culture development regarding network and information security issues
  • Providing independent conclusions, orientations and advice on matters within its scope and objectives

It is proposed that ENISA's responsibilities be expanded accordingly to manage the use of TSs within the context noted above. It should be clarified that it is not suggested that ENISA actively develops or maintains software engineering methodologies. Instead it should coordinate industry and formal standards bodies in identifying those methodologies that should be formalised as TSs. It should also ensure (through research) that existing TSs remain capable of achieving the necessary levels of quality assurance as the software industry evolves. Finally it should maintain a list of current TSs as a guide to software vendors. One procedure for achieving this has already been outlined by the EU within Regulation No. 1025/2012, which suggests maintaining a list of standards or TSs within the Official Journal of the European Union. [119] By implementing an EN for CSA quality, and engaging ENISA to facilitate the management of the TSs, the regulatory requirement to eliminate patent (known) defects prior to software being released to the market will be met. On that basis the following section addresses the second regulatory objective, this being the disclosure of latent (unknown) defects.

4.4 Mandating Responsible Disclosure

The above section outlined how regulation can be used to address patent defects. However, as noted previously, this paper further argues for a regulatory model that ensures that latent defects identified post release are addressed by the appropriate software vendor. On that basis this section focuses upon the proposed NIS Directive, and how it could be modified to encompass CSA vendors for this purpose.

The primary purpose of the NIS Directive is to dissolve the information asymmetry associated with information security incidents and breaches by mandating their disclosure when they are discovered. Generally this requires notification to a supervisory authority, which the NIS Directive requires to be a Computer Emergency Response Team (CERT) with responsibility for handling incidents and risks according to a well-defined process. [120] The NIS Directive was developed to support the European cyber security strategy, which aims to combat cyber attacks against information systems. [121] Its proposal was based on the premise that Member States cannot effectively achieve this at a national level, and that a unified approach within the EU was required, with the EC citing the need for a network and information security regulatory infrastructure due to concerns around 'security incidents, such as human mistakes, natural events, technical failures or malicious attacks.' [122] It was acknowledged that businesses lack incentives to conduct serious risk management, involving risk assessment and taking appropriate steps to ensure NIS [123] - a concern consistent with the economic justifications for regulation presented earlier in this paper.

Addressing these concerns, the NIS Directive proposes harmonisation across the EU with respect to network and information systems security. Applying principally to information society services, it also extends to any organisation important to the EU economy, or performing an important societal function, that utilises Internet or electronic communications networks and services. [124] Its principal aim is to raise the ability of the EU to protect against information security incidents and breaches, with improved protection and cooperation across all Member States being intrinsic to this - in particular given that levels of protection vary across the EU and because information security threats are borderless in nature. [125] The proposed Directive requires all Member States to establish competent authorities for NIS, form CERTs, and adopt national NIS strategies and cooperation plans. These national authorities would thereafter collectively establish an EU-level network for secure and effective coordination, information exchange, and detection and response, through which Member States could counter NIS threats and incidents based upon the EU NIS cooperation plan. The model also seeks to establish a culture of risk management across all sectors, enabling specific critical sectors to perform risk assessment and appropriate mitigation to ensure NIS, and requiring the reporting of any serious compromise of their NIS to the appropriate authorities if the continuity/supply of critical goods and services is affected. Furthermore, the NIS Directive recognises the importance of standardisation, [126] encouraging the use of standards to facilitate information security, and appointing CERTs with the responsibility for promoting information security standards at national levels.
On that basis the proposed Directive identifies ENISA as a body that shall assist in the cooperation network, for instance in coordinating responses and circulating information on risks and incidents, as well as jointly assessing national NIS strategies. [127]

In the context of this paper the basic framework of the NIS Directive is generally supportive of the regulatory requirements. Unfortunately, in its current form it is unclear whether the NIS Directive applies to CSA vendors; this uncertainty derives from its drafting. Primarily this is because software vendors are specifically excluded under paragraph 24 of the Directive, which excludes software and hardware developers because they 'are not providers of information society services'. However, the same paragraph does extend the Directive to 'operators of critical infrastructure which rely heavily on information and communications technology and are essential to the maintenance of vital economical or societal functions'. Arguably, for reasons given previously, CSA vendors are a critical component of the Internet infrastructure, and are certainly essential for ensuring information security across it. However, they are also software developers, and are thereby likely excluded under the current drafting. The failure to recognise the role of software defects is considered an oversight, as is the broad exclusion of software developers without consideration of the CSA ecosystem. This suggests that a breach taking place as a result of a CSA defect would require the service provider, and not the CSA vendor, to notify the CERT. This is clearly problematic: service providers are dependent on CSAs for their security, and, more importantly, the CSA vendor will better understand the nature of the defect and the resulting breach. A BIS consultation on the proposed NIS Directive also highlighted that the scope of the market operators bound by the Directive is unclear and ambiguous. [128] Noting stakeholder concerns that the Directive does not bind the whole supply chain, it highlights that the exclusion of software developers and hardware manufacturers cannot be supported. [129]

The Directorate General for Internal Policies explains why software and hardware manufacturers are excluded under the proposed NIS Directive. A 2013 report on Data and Security Breaches and Cyber-Security Strategies in the EU and its International Counterparts cites this as being because such entities would report vulnerabilities as opposed to breach incidents, [130] drawing the distinction that it is the exploitation of these vulnerabilities that results in breaches, rather than the vulnerabilities being breaches per se. The report takes the view that reported vulnerabilities would be zero-day, meaning that they are newly discovered and not yet addressed by either the computer security industry or the software vendor. It states that such vulnerabilities would have potential only, but no demonstrable economic or societal harm, and that the CERT authority would be flooded with technical information about potential attacks without substantiating evidence of impact. [131] A second point, regarding regulatory purchase, is also made: the report believes that because many software (and hardware) vendors are headquartered outside of the EU, enforcement of the disclosure requirements would be problematic, and vendors might be forced to comply with different and potentially opposing frameworks in their home jurisdiction and the EU. Of these two points it can be argued that the first is valid for general software types only, but not for CSAs. This is on the basis that any CSA defect has the potential to impact the security function of the software, and hence should be disclosed; waiting until that defect is exploited is the wrong approach if the defect is identified and can be addressed prior to widespread exploitation. With regard to the second point, the proposed EN for CSA quality would assist, given its proposed application to all CSAs sold on the European Single Market, irrespective of origin.

The response to the proposed NIS Directive has been largely one of opposition. This is due to the aforementioned exclusion of software, but also to perceived overlaps with obligations for breach notification under the proposed Data Protection Regulation, [132] with insufficient clarity over how the two schemes will work in tandem. [133] For some industries an overlap will exist, but not for CSA software (and hardware) vendors. To avoid confusion it has been suggested that the NIS Directive be held back until the Data Protection Regulation comes into effect. [134] Unfortunately this would delay the regulation of NIS, and ultimately of CSAs were they to be included. [135]

Recognising CSAs under the proposed NIS Directive is one part of the challenge; the second is how it would meaningfully address latent defects thereafter. Opinions on this topic, and suggested changes to the proposed Directive, were provided by Schlyter to the committee on civil liberties, justice and home affairs. [136] These raised several concerns and suggestions for amendment, including a proposed amendment to Article 14 to bring software vendors under the proposed Directive, [137] and a separate amendment to Article 16 proposing to address some issues of liability exclusion. [138]

Although there is no proposed infrastructure to handle CSA breaches, there is an evidently strong consensus that software should not be ignored under disclosure laws. The main objective of disclosure would be to mandate the CSA vendor to take appropriate action to address the defect in a timescale commensurate with its risk. Currently there are several ways in which software vendors are alerted to latent defects, including internal quality control checks, independent security research, and white hat hackers. [139] Whilst it is not legally binding, a gentleman's agreement exists within the community on how security defects are addressed. At one end of the scale is non-disclosure, whereby defects are not disclosed; this is often practiced within the black hat community to enable defects to be capitalised upon. At the opposite end is full disclosure, [140] whereby the vulnerability and the code to exploit it are publicised. [141] This is based largely on the belief that without the risk of full disclosure vendors have no incentive to fix problems swiftly. [142] For CSAs this carries a greater risk to the end-user if the defect is exploited before the vendor is able to respond. Taking a middle ground is what is termed responsible disclosure, [143] whereby the defect and its exploitation are disclosed to the appropriate software vendor, with sufficient time given to address the problem; were this to prove unsuccessful then full disclosure might ensue. There are therefore many pros and cons to vulnerability disclosure, with the wrong choice capable of exacerbating a problem and being counterproductive to end-user information security. Hence the intention is to place sufficient pressure on the vendor to address the defect, without risking further exploitation.

Consequently, breach notification for software vendors could not work in the same manner as for service providers, as the cons far outweigh the pros. However, breach notification in the form of responsible disclosure is still important in order to 'legalise' the existing gentleman's agreement. The NIS Directive only encourages the disclosure of serious breaches of services, and the same should apply to CSAs. [144] If a software defect is discovered - irrespective of how - and it is capable of exploitation, then the responsible CSA vendor and the CERT should be notified. This notification should not extend to the general public, for the reasons already given. The CSA vendor must then take appropriate measures to address the defect, with oversight of the process maintained by the national CERT. Given that under the proposals of this paper all CSA vendors would be obliged to have followed a TS as part of software development, there would be an auditable trail capable of identifying whether that vendor was likely aware of the defect prior to releasing the software. It is therefore envisaged that over time the use of a standardised approach to software quality, supported by the mandatory disclosure of defects, would establish a baseline of quality.

5. Conclusion

There is little doubt that for the foreseeable future the Internet will continue to exhibit an upwards trend in utilisation, revealing a myriad of new use cases that further connect the physical and digital. With such changes will come more diverse and sizeable information security concerns, with greater reliance on CSAs a natural by-product of this trend. This alone is sufficient to warrant a focus on quality assurance as a means of developing consumer trust and overcoming the barriers of information asymmetry, in particular as consumers have no means of independently verifying the quality of the software that they use. Governments have (as referenced earlier in this paper) established security and trust as key supporting pillars of national economic growth, further demonstrating the importance of cultivating an environment of quality and consumer confidence. Hence, in light of the limitations of private law and compensation in otherwise addressing this issue, direct forms of regulation are identified as the only viable methods through which these assurances can be provided and the objectives met. Regulation therefore satisfies several requirements that private law, as it stands, cannot: in particular it can create deterrence where private law is unable to adequately support end users in establishing liability for loss or damage resulting from defective CSAs.

The analysis of suitable regulatory tools and mechanisms in this paper is intrinsically based on the present issue of information asymmetry in the quality of CSAs. It is however key to recognise the future trajectory of technology and the threat landscape, ensuring that any selected regulatory mechanisms are sufficiently agile to adapt and nurture quality, rather than inhibiting or restricting CSAs. It is envisaged that the use of standardisation will help achieve this.

Finally it should be noted that whilst this paper is primarily concerned with the regulation of quality within CSAs (for reasons given in the introduction) there is no reason why the proposed framework could not be expanded in the future - albeit in a modified form - to encompass the broader software industry and raise software quality in general.

[1] The University of Southampton

[2] It is long established that the Internet, and the larger information infrastructure, are not secure. See: [accessed 10.04.2013] as published in The Froehlich/Kent Encyclopaedia of Telecommunications vol. 15. Marcel Dekker, New York, 1997, pp. 231-255.

[3] In addition to adversely impacting software functionality some defects can be exploited as a means of compromising the integrity of the computing environment or data on which the defective software resides. Such exploits do not always impact software functionality, and can therefore go unnoticed by end users. If these defects enable circumvention of standard operating functions or subversion of security they can be classified as vulnerabilities, which based on conditions (such as environment and configuration) may become exploitable.

[4] For example the former Stanford University Clean Slate Program. See: [accessed 09.04.2013]

[5] A definition of CSAs is provided below.

[6] For example the seventh data protection principle, as outlined under Section 4(1)(7) Data Protection Act 1998, requires appropriate technological measures are implemented to secure data.

[7] See: Nanz, S., (Ed.): The Future of Software Engineering, Springer Berlin Heidelberg, 2011

[8] A patent defect being that which is identifiable through a thorough inspection of the software, and a latent defect being that for which a thorough inspection would not identify. This is important because in some instances consumers have the ability to inspect, and even trial, software prior to purchase, and therefore it must be established whether or not the consumer could have realistically been aware of it.

[9] Empirical research provided by Jones et al [see: Jones, C., Bonsignour, O., Subramanyam, J., (2011) The Economics of Software Quality (Addison-Wesley)], and drawn upon in Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry [Computer Law & Security Review, 29, (4), 413-429] is representative of a broad cross-section of the software industry (including CSAs), although it does not provide statistics per software category. On that basis it is not possible to categorically know how CSA quality compares to that of other software types from a statistical perspective.

[10] For example in 2010 there was one Internet enabled device per global capita, a figure expected to double by 2015. The amount of information exchanged is also expected to increase by 400% over the same period. Source: [accessed 09.07.2013]

[11] See: Harper, R., Rodden, T., Rogers, Y., and Sellen, A., (2008) Being Human: Human-Computer Interaction in the Year 2020 (Microsoft Research Ltd.), available at: [accessed 11.07.13]

[12] For example developments in 3D printing have the potential to enable computers to print human organs (see: [accessed 18.07.13]). Body Area Networks are also being developed to monitor vital health statistics (see: and

[13] See generally: Harper, R., Rodden, T., Rogers, Y., Sellen, A., (2008) Being Human: Human-Computer Interaction in the Year 2020 (Microsoft Research Ltd.), available at: [accessed 11.07.13]

[14] Ormandy, T., Applied attacks against Sophos Antivirus. Available at: [accessed 16.04.2013] See also: [accessed 16.04.2013]

[15] See: [accessed 16.04.2013]

[16] For example, an operating system includes security functions such as restricting access to data (permissions), providing inspection for other CSA providers, and restricting execution of unwanted code or applications. These functions undoubtedly fall within the definition of a substantial security role even if the vendor would not classically be called a security vendor based upon the main focus of their business.

[17] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry. Computer Law & Security Review, 29, (4), 413-429

[18] BIS Department for Business Innovation & Skills (July 2012) Enhancing Consumer Confidence by Clarifying Consumer Law: Consultation on the supply of goods, services and digital content, available at: [accessed 22.09.2012]

[19] For instance breach notification and security requirements under data protection legislation.

[20] For example Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data OJ L 281, and Directive 1999/93/EC of the European Parliament and of the Council of 13 December 1999 on a Community framework for electronic signatures OJ L 13

[21] See: Anderson, R., (2001) Why Information Security is Hard-An Economic Perspective. In Proceedings of the 17th Annual Computer Security Applications Conference; Anderson, R., (1994) "Why Cryptosystems Fail" in Communications of the ACM, vol 37 no 11, pp 32-40

[22] See: Shapiro, C., Varian, H., (1998) Information Rules, Harvard Business School Press, ISBN 0-87584-863-X

[23] Discussions on this topic include: Rowland, D., Kohl, U., and Charlesworth, A., Information Technology Law, 4th Edition (Routledge) pp.434-476; Rustad, M., and Koeing, T., H., (2005) The tort of Negligent Enablement of Cybercrime, Berkeley Technology Law Journal, Vol 20 no 4; Scott, M., D., (2008) Tort Liability for Vendors of Insecure Software: Has the Time Finally Come? Maryland Law Review, Vol. 62, No. 2; ed, Reed, C., and Angel, J., (2007) Computer Law, (OUP) 6th ed; Brennan, L., (2000) Why Article 2 Cannot Apply to Software Transactions, Duquesne Law review, Vol 38, No 2; Kim, N., S., (2008) The Software Licencing Dilemma, (Brigham Young University Law Review); Poyton, D., A., (2005), Dematerialized goods and liability in the electronic environment: the truth is, 'there is no spoon (box)', (International Review of Law, Computers & Technology), vol 19, iss 1, p.86

[24] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry. Computer Law & Security Review, 29, (4), 413-429

[25] Goods are addressed under the Sale of Goods Act 1979, of which Section 14 is most relevant requiring that 'goods must be of satisfactory quality', whilst services are addressed under Part 2 of the Supply of Goods and Services Act 1982 which requires that they are provided using 'reasonable care and skill'.

[26] Whilst beyond the scope of this paper, this is typically when the 'acquisition' and licensing of software takes place simultaneously, thereby satisfying the element of consideration necessary to form a contract.

[27] Hadley v. Baxendale [1854] EWHC Exch J70

[28] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry. Computer Law & Security Review, 29, (4), 413-429

[29] For example, if an application contains a format string or buffer overflow vulnerability which allows access to operating system (OS) data structures such as the stack, it may be used as a proxy to create a new child process or to circumvent the OS. The attack and malicious behaviour can appear in an entirely different application, or in the OS itself, and it can be incredibly difficult to trace the original exploit vector back to the application in which the coding error was made.

[30] The Computer (Compensation for Damage) Bill 1990 evidences that legislators have been aware of these issues for some time. Bill 180, presented by Mr Harry Cohen to the House of Commons, printed 5th of July 1990

[31] Kaner, C., (May 1998) Bad Software-Who is Liable?, Proceedings of the American Society for Quality's 52nd Annual Quality Congress.

[32] A 2007 House of Lords security report went so far as to suggest that vendor liability be further explored at a European level, but fell short of advocating a shift of liability. See: House of Lords Science and Technology Committee, Personal Internet Security, Volume I Report, HL Paper 165-I, 10 August 2007, p.38-411

[33] Draft Consumer Rights Bill 2013

[34] Under the draft CRB consumers can seek repair and replacement (S.45), price reduction (S.46), right to refund (S.47) and, under a newly added provision, compensation for damages to devices or other digital content (S.48).

[35] BIS Department for Business Innovation & Skills (July 2012) Enhancing Consumer Confidence by Clarifying Consumer Law: Consultation on the supply of goods, services and digital content, p. 141. Available at: [accessed 22.09.2012]

[36] St Albans City and District Council v International Computers Limited [1995] FSR 686 and [1997] F.S.R 251

[37] Mayor and Burgesses of the London Borough of Southwark v IBM UK Ltd [2011] EWHC 549 (TCC)

[38] See: Moore, R., (2010) Principles of the law of software contracts - the way forward? Computer Law & Security Review, 26, (4), 427 - 431

[39] Cases include: Ilan Sys., Inc. v. Netscout Serv. Level Corp., 183 F. Supp. 2d 328, 332 (D. Mass. 2002); Step-Saver Data Systems, Inc. v. Wyse Technology, 939 F.2d 91 (3d Cir. 1991); ProCD v. Zeidenberg, 908 F. Supp. 640, 652 (W.D. Wis. 1996); Montgomery County v. Microvote Corp., 320 F.3d 440 (3d Cir. 2003); L.S. Heath & Son, Inc. v. AT&T Info. Sys., Inc., 9 F.3d 561 (7th Cir. 1993); Mesa Bus. Equip., Inc. v. Ultimate Southern California, Inc., 1991 WL 66272 (9th Cir. 1991); Sierra Diesel Injection Servs, Inc. v. Burroughs Corp., 874 F.2d 653 (9th Cir. 1989); Neilson Bus. Equip. Ctr., Inc. v. Italo V. Monteleone, 524 A.2d 1172 (Del. 1987); McCrimmon v. Tandy Corp., 414 S.E.2d 15 (Ga. Ct. App. 1991); Harris v. Sulcus Computer Corp., 332 S.E.2d 660 (Ga. Ct. App. 1985); Communications Groups, Inc. v. Warner Communications, Inc., 527 N.Y.S.2d 341 (N.Y. Civ. Ct. 1988); Microsoft Corp. v. Manning, 914 S.W.2d 602 (Tex. Ct. App. 1995).

[40] For example Saphena Computing v Allied Collection Agencies [1995] FSR 616, and SAM Business System v Hedley & Co. [2003] 1 All ER Comm 465

[41] Recommendation 72. Leveson Enquiry, An Inquiry into the Culture, Practices and Ethics of the Press, Executive Summary, Nov 2012, p. 42. See: [accessed 10.04.2013]

[42] Liebeck v. McDonald's Restaurants, No. D-202-CV 9302419, 1995 WL 360309 (N.M. Dist.Ct. 1994)

[43] Grimshaw v. Ford Motor Company (1981) 119 Cal.App.3d 757, 174 Cal.Rptr. 348

[44] Hilliard v. A.H. Robins (1983) 148 Cal.App.3d 374

[45] Kagan, R., A., (2003) Adversarial Legalism - The American way of law, (Harvard University Press)

[46] Kagan, R., A., (2003) Adversarial Legalism - The American way of law, (Harvard University Press), p. 13

[47] Although Kagan compares US legal culture to that of the EU, which is a collection of many legal cultures, his observation is still representative in as far as the US is unique in regulation through adversarial legalism.

[48] Kagan, R., A., (2003) Adversarial Legalism - The American way of law, (Harvard University Press), p. 159-164

[49] See: Meuti, M., D., (2004) Legalistic individualism: An alternative analysis of Kagan's Adversarial legalism, Hastings International and Comparative Law Review, vol 27 winter. Meuti looks at the culture of ethics of individualism which he believes is at the root of adversarial legalism.

[50] Zedner, L., (2006) The concept of security: an agenda for comparative analysis, Legal Studies, p. 168

[51] For more details on the link between the US and European approach to standardisation see: Winn, J. K., (2005) US and EU Regulatory Competition in ICT Standardization Law and Policy, The 4th Conference on Standardization and Innovation in Information Technology, 21-23 Sept, p. 286

[52] See: [accessed 07.05.2013]

[53] See: [accessed 01.06.2013]

[54] CEN/CENELEC paper -Standardisation and the Digital Agenda, 15th April 2011. See: [accessed 20.05.2012] Section 3 Towards a better integration of fora and consortia specifications into the European Standardization System.

[55] Kuwahara, E., (2007) Torts v. Contract: Can Microsoft be held liable to home consumer for its security flaws? Southern California Law Review vol 80 1009

[56] See: [accessed 01.06.2013]

[57] See Shleifer, A., (2005) Understanding Regulation, European Financial Management, Vol. 11, No. 4, 439-451

[58] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry. Computer Law & Security Review, 29, (4), 413-429.

[59] See: Arora, A., et al., (2006) Sell First, Fix Later: Impact of Patching on Software Quality, 52 MGMT SCI. 465, 466 and Skibell, R., (2003) The Phenomenon of Insecure Software in a Security-Focused World, 8 U. FLA. J. TECH. L. & POL'Y 107, 115

[60] Although arguably this remains the result of an inadequate investment of resource in their identification - the economic efficiencies and incentives of which having been extensively researched elsewhere. For example Schechter investigates how to accelerate the testing process and make it cost effective, whilst at the same time enabling software quality to be measured. See: Schechter, S., (2002) How to Buy Better Testing. In Proceedings of the International Conference on Infrastructure Security (InfraSec '02), Edited by Davida, G., I., Frankel, Y., and Rees, O., Springer-Verlag, London, UK, UK, 73-87. Likewise Camp and Wolfram propose models for sharing the costs of vulnerabilities back onto software vendors as a deterrence against substandard quality. See: Camp, L. , J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, vol. 12

[61] Holcombe, R., G., (1997) A theory of the theory of public good, Review of Austrian Economics, vol 10, no 1 pp.1-22

[62] Holcombe, R., G., (1997) A theory of the theory of public good, Review of Austrian Economics, vol 10, no 1 pp.1-22

[63] Camp, L., J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, Vol. 12, p. 1

[64] Loader, I., and Walker, N., (2007) Civilising Security, (Cambridge), p. 7

[65] Holcombe, R., G., (1997) A theory of the theory of public good, Review of Austrian Economics, vol 10,

[66] Gibbs, H., A., (2011) Constitutional Life and Europe's Area of Freedom, Security and Justice (Applied Legal Philosophy), Ashgate, p.49

[67] Gibbs, H., A., (2011) Constitutional Life and Europe's Area of Freedom, Security and Justice (Applied Legal Philosophy), Ashgate, p.49

[68] Gibbs, H., A., (2011) Constitutional Life and Europe's Area of Freedom, Security and Justice (Applied Legal Philosophy), Ashgate, pp. 47-57

[69] Anderson, R., Böhme, R., Clayton, R., and Moore, T., (2007) Security Economics and European Policy, p. 1, notes that in the last four decades there has been a shift in information security from government to industry.

[70] Loader, I., and Walker, N., (2007) Civilising Security, (Cambridge), p.22

[71] For example the role that private security should take in the protection of merchant vessels. See: Caldwell, R., G., (Oct 2012) Private Security and Armed Military Guards: Minimising State Liability in the Fight Against Maritime Piracy, RUSI Journal, Vol. 157, No. 5


[73] Note that a recent consultation by the Home Office into private security suggests the introduction of quality assurance and standards. See: [accessed 01.06.2013]

[74] Varian, H., R., (2004) System Reliability & Free Riding

[75] Varian, H., R., (2004) System Reliability & Free Riding, p.1

[76] Hirshleifer, J., (1983) From Weakest-Link to Best-Shot: The Voluntary Provision of Public Goods, Public Choice, Vol. 41, No. 3, pp. 371-386

[77] Camp, L., J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, Vol. 12, p. 1

[78] Anderson, R., (2001) "Why Information Security is Hard--An Economic Perspective." In: Proceedings of the 17th Annual Computer Security Applications Conference, New Orleans, LA.

[79] Powell, B., (2005) Is Cybersecurity a Public Good? Evidence from the Financial Services Industry, 1 J. L. Econ. & Pol'y 497, p.4

[80] As Anderson notes: "While individual computer users might be happy to spend $100 on anti-virus software to protect themselves against attack, they are unlikely to spend even $1 on software to prevent their machines being used to attack Amazon or Microsoft." See: Anderson, R., (2001) Why Information Security is Hard-An Economic Perspective In Proceedings of the 17th Annual Computer Security Applications Conference, p.1

[81] It is wrong to concede that the private industry has entirely replaced public security. There are many examples of public security within an online context. These include the National Crime Agency which polices (amongst other serious crimes) cybercrime, the Internet Watch Foundation (IWF) which works to minimise the presence of 'potentially criminal' content on the Internet, Nominet which is responsible for the allocation of domain names and the resolution of disputes related to them, as well as the European Network and Information Security Agency (ENISA) which works to maintain network and information security for the EU and its Member States.

[82] An Australian parliamentary report into cybercrime recommended that Internet Service Providers (ISPs) force customers to use anti-malware and firewall software or risk disconnection. See: Hackers, Fraudsters and Botnets: Tackling the Problem of Cyber Crime, The Report of the Inquiry into Cyber Crime, House of Representatives, Standing Committee on Communications, June 2010, Canberra, Australia. Available at: [accessed 30.03.2013]

[83] Another definition of consumer welfare is given by Cseres: "In consumer law consumer welfare stands for correcting market failures in order to improve the consumer's position in market transactions. Consumer welfare is concerned with efficient transactions and cost-savings but it is also directed at social aspects of the market such as the safety and health of consumers. Consumer welfare is an economic concept with relevant socio-political and legal implications." Cseres, K., J., (2007) The Controversies of the Consumer Welfare Standard, The Competition Law Review, Volume 3 Issue 2 p. 121

[84] Trust has been characterised as the "willingness of a party to be vulnerable to the actions of another party based on the expectations that the other party will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party" Fukuyama F, (1996) Trust: The social virtues and the creation of prosperity, (Free Press). Note that a full review of the academic literature on trust is out of the scope of this paper.

[85] Camp, L., J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, Vol. 12, p. 1

[86] Gandal, N., (2008) "An Introduction to Key Themes in the Economics of Cybersecurity," in Cyber Warfare and Cyber Terrorism (IGI Global eds.)

[87] Camp, L., J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, Vol. 12, p. 3

[88] Camp, L., J., and Wolfram, C., D., (2004) Pricing Security: Vulnerabilities as Externalities. Economics of Information Security, Vol. 12, p. 3

[89] Jaffe, A., B., Newell, R., G., and Stavins, R., N., (August 2005) A tale of two market failures: Technology and environmental policy, Ecological Economics, Volume 54, Issues 2-3, 1, pp. 166-167

[90] Jaffe, A., B., Newell, R., G., and Stavins, R., N., (August 2005) A tale of two market failures: Technology and environmental policy, Ecological Economics, Volume 54, Issues 2-3, 1, p. 165

[91] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry [Computer Law & Security Review, 29, (4), 413-429]

[92] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry [Computer Law & Security Review, 29, (4), 413-429]

[93] See: Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union - COM(2013) 48 final - 7/2/2013

[94] See: Radaelli, C., M., (2004) "The Puzzle of Regulatory Competition." Journal of Public Policy 24(1): 1-23; Carruthers, B., G., & Lamoreaux, N., (2009). Regulatory Races: The Effects of Jurisdictional Competition on Regulatory Standards. Working paper, Department of Sociology, Northwestern University, Evanston, IL.

[95] Schlueter, L., L., (2005), Punitive Damages, 5th edition (LexisNexis), pp.6-7

[96] OECD Guiding Principles for Regulatory Quality and Performance, p. 3. See: [accessed 10.06.2013]

[97] Such as the objectives of area of freedom, security and justice (AFSJ) and the digital agenda. See: Interview with Neelie Kroes about Trust and Security in the Digital Agenda, For AFSJ see generally: and Digital Agenda - Pillar 3, Trust and security, and [accessed 15.06.2013]

[98] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry. Computer Law & Security Review, 29, (4), 413-429

[99] See generally: Salamon, L., M., (2002) The Tools of New Government: A Guide to the New Governance (OUP); Black, J., (2001) 'Decentring regulation" Understanding the role of regulation and private regulation in a post-regulatory world', Current Legal Problems, 54; Eberlein, B., Abbott, K, W., Black, J., Meidinger, E., and Wood, S., (2013) Transnational Business Governance Interactions: Conceptualization and Framework for Analysis, 7 Reg. & Governance

[100] Eberlein, B., Abbott, K, W., Black, J., Meidinger, E., and Wood, S., (2013) Transnational Business Governance Interactions: Conceptualization and Framework for Analysis, 7 Reg. & Governance, p.21

[101] Salamon, L., M., (2002) The Tools of Government: A Guide to the New Governance (OUP), p.22

[102] Salamon notes that one of the main features of newer tools is that they are often indirect, i.e. heavily reliant upon private actors. Salamon, L., M., (2002) The Tools of Government: A Guide to the New Governance (OUP), p.22

[103] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry, Computer Law & Security Review, 29(4), 413-429

[104] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry, Computer Law & Security Review, 29(4), 413-429

[105] The New Approach derives from Council Resolution of 7 May 1985 on a new approach to technical harmonization and standards (85/C 136/01). This Resolution was subsequently implemented by Directive 98/34/EC of the European Parliament and of the Council of 22 June 1998 laying down a procedure for the provision of information in the field of technical standards and regulations, as amended by Directive 98/48/EC of 20 July 1998 in respect of rules on Information Society services.

[106] Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union - COM(2013) 48 final - 7/2/2013

[107] A European Norm is a legally mandated standard that is binding throughout all European Member States, and which takes precedence over any conflicting national standards. Traditionally de jure standards are formal standards, meaning that they derive from official standards bodies, and which avoid the shortcomings of market/consortia standards by providing consumers with a single, comprehensive standard that offers a high level of quality and consumer welfare. Any standard can become de jure as long as it is legally mandated or formally adopted. This means that whilst market/consortia standards can become de jure, it is generally formal standards that do because they meet the necessary criteria. The most successful de jure ICT standard is that of GSM (Global System for Mobile Communications) - formerly Groupe Spécial Mobile - used for mobile telephony. See: Pelkmans, J., (2001) The GSM standard: explaining a success story, Journal of European Public Policy, 8(3), 1 June; Bekkers, R., Verspagen, B., and Smits, J., (2002) Intellectual property rights and standardization: the case of GSM, Telecommunications Policy, 26(3), April, pp. 171-188; and Van Eecke, P., (2006) 1st Interim Report, EU ICT Standardisation, Study on the Specific Policy Needs for ICT Standardisation, October, p. 71

[108] This technological neutrality parallels the widely criticised Electronic Signatures Directive (Directive 1999/93/EC of the European Parliament and of the Council of 13 December 1999 on a Community framework for electronic signatures. OJ L 13, 19.1.2000, p. 12-20). See also: Proposal for a Regulation of the European Parliament and of the Council on electronic identification and trust services for electronic transactions in the internal market COM/2012/0238 final - 2012/0146 (COD)

[109] Moore, R., (2013) Standardisation: a tool for addressing market failure within the software industry, Computer Law & Security Review, 29(4), 413-429

[110] For reference see: White Paper Modernising ICT Standardisation in the EU - The Way Forward, 3.7.2009 COM(2009) 324 final; Proposal for a Regulation of the European Parliament and of the Council on European Standardisation and amending Council Directives 89/686/EEC and 93/15/EEC and Directives 94/9/EC, 94/25/EC, 95/16/EC, 97/23/EC, 98/34/EC, 2004/22/EC, 2007/23/EC, 2009/105/EC and 2009/23/EC (COM(2011)315); and CEN/CENELEC paper - Standardisation and the Digital Agenda, 15th April 2011, Section 3 - Towards a better integration of fora and consortia specifications into the European Standardization System. Available at: [accessed 14.07.2013]

[111] Commission Decision of 28 November 2011 setting up the European multi-stakeholder platform on ICT standardisation (2011/C 349/04)

[112] Regulation (EU) No 1025/2012 of the European Parliament and of the Council of 25 October 2012 on European standardisation, amending Council Directives 89/686/EEC and 93/15/EEC and Directives 94/9/EC, 94/25/EC, 95/16/EC, 97/23/EC, 98/34/EC, 2004/22/EC, 2007/23/EC, 2009/23/EC and 2009/105/EC of the European Parliament and of the Council and repealing Council Decision 87/95/EEC and Decision No 1673/2006/EC of the European Parliament and of the Council. OJ L 316, 14.11.2012, p. 12.

[113] Commission (EC), 'A strategic vision for European standards: Moving forward to enhance and accelerate the sustainable growth of the European economy by 2020' (Communication) COM (2011) 311 final, 1 June 2011.

[114] Introduced in part as response to Council Resolution of 28 October 1999 on the role of standardisation in Europe [OJ C 141 of 19.05.2000]

[115] Details regarding the use of the CE mark are found within Council Decision 93/465/EEC of 22 July 1993 OJ L 220 concerning the modules for the various phases of the conformity assessment procedures and the rules for the affixing and use of the CE conformity marking, which are intended to be used in the technical harmonisation Directives.

[116] An example of its use is within toy safety, whereby legislation places requirements on toys to ensure their conformity, in particular for toys aimed at specific age ranges, and the presence of the CE marking indicates that conformity. See: Directive 2009/48/EC of the European Parliament and of the Council of 18 June 2009 OJ L 170 on the safety of toys, implemented in the UK under British Standard BS EN 71.

[117] ENISA is an executive European agency that presently operates as a centre of expertise for information security and cyber security issues. ENISA was established by Regulation (EC) No 460/2004, and subsequently extended under Regulation (EC) No 1007/2008 and Regulation (EU) No 580/2011. Regulation (EC) No 460/2004 has recently been repealed and replaced by Regulation (EU) No 526/2013 of the European Parliament and of the Council of 21 May 2013.

[118] As outlined under Article 3 of Regulation 526/2013.

[119] Para. 34

[120] Article 7

[121] See: [accessed 10.09.2013]

[122] Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union, COM(2013) 48 final, p. 1

[123] Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union, COM(2013) 48 final, p. 3

[124] This includes entities such as cloud providers, Internet and e-commerce companies and related providers (which include search engines, social media, online retailers, etc.), utility and services providers spanning energy, transportation, healthcare, etc., and banks and other financial institutions.

[125] Executive Summary of the Impact Assessment - SWD(2013) 31 final - 7/2/2013, pp.2-4

[126] Article 16

[127] Article 8

[128] BIS, Call for Evidence on proposed EU Directive on Network and Information Security, September 2013, p.25

[129] BIS, Call for Evidence on proposed EU Directive on Network and Information Security, September 2013, p.25

[130] Directorate General for Internal Policies Policy Department A: Economic and Scientific Policy Industry, Research and Energy, (2013) Data and Security Breaches and Cyber-Security Strategies in the EU and its International Counterparts, PE 507.476. p. 100

[131] Directorate General for Internal Policies Policy Department A: Economic and Scientific Policy Industry, Research and Energy, (2013) Data and Security Breaches and Cyber-Security Strategies in the EU and its International Counterparts, PE 507.476. pp. 102-105

[132] See Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) COM(2012) 11 final

[133] See Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) COM(2012) 11 final

[134] Draft Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on the Internal Market and Consumer Protection on the proposal for a directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union (COM(2013)0048 - C7-0035/2013 - 2013/0027(COD)) p.3

[135] Other concerns do exist, but not all relate to the scope or practical function of the Directive. For instance, it is suggested that the proposed Directive seeks to 'militarise' security in cyberspace by proposing what is tantamount to an EU equivalent of the US NSA (National Security Agency) and UK GCHQ (Government Communications Headquarters), granting 'draconian' powers to ENISA and Member States in the process. This is a bold claim, and it is therefore vital that the EU addresses it by taking into consideration the intended aim of the Directive and distinguishing it from real or perceived misuse through explicit exclusions of militarisation or similar uses. See Anderson, R., (2013) ENDitorial: Questions On The Draft Directive On Cybersecurity Strategy, European Digital Rights. Available at: [accessed 16.09.2013]

[136] Draft Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on the Internal Market and Consumer Protection on the proposal for a directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union (COM(2013)0048 - C7-0035/2013 - 2013/0027(COD)) p.3

[137] Software producers shall be responsible for correcting security breaches within 24 hours of being informed in serious cases, and within 72 hours in cases where the effects are unlikely to result in any significant financial loss or serious breach of privacy. Draft Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on the Internal Market and Consumer Protection on the proposal for a directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union (COM(2013)0048 - C7-0035/2013 - 2013/0027(COD)), p.10

[138] 'Commercial software producers shall not be protected from "no-liability" clauses when it can be demonstrated that their products are not properly designed to handle foreseeable security threats.' Draft Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on the Internal Market and Consumer Protection on the proposal for a directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union (COM(2013)0048 - C7-0035/2013 - 2013/0027(COD)), p.3

[139] A white hat hacker tests security for non-malicious reasons, and is often termed an ethical hacker. This is in contrast to a black hat hacker, who does so for malicious and illegal purposes only - usually for personal gain.

[140] See Shepherd, S., A., (2003) How do we define Responsible Disclosure?

[141] One of the first full disclosure groups was Bugtraq. See [accessed 10.09.2013]

[142] See Shepherd, S., A., (2003) How do we define Responsible Disclosure? [accessed 10.07.2013]

[143] See Shepherd, S., A., (2003) How do we define Responsible Disclosure? [accessed 10.07.2013]

[144] Due to criticism over the exact type of incident that should be notified, both 'security breach' and 'data breach' have now been defined. See the guide created by Directorate General for Internal Policies Policy Department A: Economic and Scientific Policy Industry, Research and Energy, (2013) Data and Security Breaches and Cyber-Security Strategies in the EU and its International Counterparts, PE 507.476, pp. 37-40