The advent of Social Networks has made both companies and public bodies tremendously exposed to the so-called Social Engineering 2.0, and thus prone to targeted cyber-attacks.
Unfortunately, no solution currently available on the market allows either the comprehensive assessment of Social Vulnerabilities or the management and reduction of the associated risk.
DOGANA aims to fill this gap by developing a framework that delivers "aDvanced sOcial enGineering And vulNerability Assessment". The underlying concept of DOGANA is that Social Driven Vulnerability Assessments (SDVAs), when regularly performed with the help of an efficient framework, help deploy effective mitigation strategies and reduce the risk created by modern Social Engineering 2.0 attack techniques. Two relevant features of the proposed framework are:

- The presence of the "awareness" component within the framework as the cornerstone of the mitigation activities;
- The legal compliance by design of the whole framework, which will be ensured by a partner and a work package explicitly devoted to this task.

Moreover, the outcomes of the project are also expected to provide a solid basis for revising insurance models for cyber-attack-related risks, thanks to the involvement of two DOGANA partners active in this area.


Latest from our Social Engineering Blog


f(human): ENISA’s report on Cyber Security Culture and the human (f)actor

Written by Danaja Fabčič Povše, KU LEUVEN

Social engineering is on the rise, and organisations need to respond to it appropriately. The human is usually the weakest (f)actor in maintaining security, which is why smart hackers target humans, not machines. This blogpost examines the new ENISA guidelines on Cyber Security Culture in organisations in the light of the upcoming GDPR and the NIS Directive.


The report

The rise of cyber-attacks against companies and of data breaches, including those exploiting the human factor through social engineering, poses high business risks. The report puts the 2014 global economic losses due to cybercrime at between €325 and €500 billion, with an average cost per individual data breach of €3.14 million. ENISA's report brings important recommendations and best practices to address these risks.

In twelve chapters, the guidance report presents a comprehensive strategy on implementing the cybersecurity culture (CSC): from defining organisation requirements, assembling the team and motivating the employees, to monitoring compliance and assessing results. CSC is defined as ‘the knowledge, beliefs, perceptions, attitudes, assumptions, norms and values of people regarding cybersecurity and how they manifest in people’s behaviour with information technologies’ (p. 7).

The idea behind the guidelines is to ensure that CSC becomes part of an employee’s business-as-usual behaviour. However, employees should be convinced and encouraged rather than compelled to comply with policies (p. 9). Hence, CSC is positioned as an organisation-wide cybersecurity culture.

The strategy is addressed from different points of view: organisational sciences, psychology, law and cybersecurity.


Legal matters!

Regarding legal aspects, the guidelines mention:

  1. Compliance of employee monitoring (implying GDPR applicability),
  2. Liability for data breaches in the General Data Protection Regulation (GDPR, Regulation (EU) 2016/679),
  3. Security and notification requirements in the Network and Information Systems Directive (the NIS Directive, Directive (EU) 2016/1148).

Attacking humans is the easiest way of attacking a system (see for example here and here). Organisations therefore have an interest in keeping an eye on what their employees are doing, by monitoring them and assessing effectiveness through metrics. Such monitoring falls under the scope of the GDPR if it involves the processing of personal data, which is quite likely: personal data are stored on our work computers and phones, but also on the personal devices we use in the workplace under bring-your-own-device policies. Employee monitoring requires an appropriate legal basis for the underlying data processing. The GDPR provides six different legal bases for data processing, including consent; however, as discussed below, consent is unlikely to be freely given in an employment scenario. A data protection impact assessment (DPIA) is also a likely requirement, according to a recent opinion of the Article 29 Working Party.

Both the NIS Directive and the GDPR contain notification duties in case of a breach. However, the two instruments have a different scope of application, so obligations differ as well.

The GDPR applies to breaches of personal data. If personal data is compromised, the data controller has to notify the data protection authority (Art. 33), and in certain circumstances also the data subject (Art. 34 – if such a breach is likely to result in high risks to the rights and freedoms of natural persons). The report focuses on implementing CSC as a means of avoiding high fines for a breach of the GDPR.

The NIS Directive, on the other hand, covers the security and notification duties of operators of essential services (the definition in Art. 5(2) applies to certain industries and depends on the provision of network-based services) in the case of an incident with a significant disruptive effect on the provision of the service. In such an event, the operator must notify the competent CSIRT (computer security incident response team) or the competent authority (Art. 14 of the NIS Directive).

The report also addresses policy aspects, namely how to give CSC a wider societal impact. It suggests the adoption of training programmes similar to NICE (the National Initiative for Cybersecurity Education) in the US. Member states should adopt national cybersecurity strategies and non-binding guidelines in order to raise awareness and improve skills across different segments of the workforce. Certification is also mentioned, as part of the proposed Regulation on ENISA, the "EU Cybersecurity Agency", repealing Regulation (EU) 526/2013, and on Information and Communication Technology cybersecurity certification (commonly referred to as the 'Cybersecurity Act'; draft available on the European Commission's website). Additionally, the report endorses the use of standards as organisational and technical guidance for CSC implementation.



However, apart from Sections 8.2 and 8.3, references to legal and policy aspects are quite rare. For example, Section 5.2.1 describes the role of the legal team as ensuring 'that all new practices contribute to the full compliance of the company with national and international legislation, including data protection' (p. 17). This includes assessing the lawfulness of monitoring employee behaviour and amending employees' contracts to redefine their obligations. However, this fails to take into account the recent opinion of the Article 29 Working Party (WP29) on data processing at work, which states that employees are rarely in a position to consent to their personal data being processed (see its Section 6.2). Consent is only valid and freely given in exceptional circumstances, namely if there are no adverse consequences linked to not accepting the offer.

Generally, companies as data controllers are better off relying on 'legitimate interests' (Art. 6(1)(f) of the GDPR) as the legal ground for employee monitoring when implementing cybersecurity measures. If they do so, they must ensure that the data processing is strictly necessary for the legitimate purposes they are pursuing, and proportionate in the sense that no less intrusive measure is available to attain the same objective; an example would be investing in prevention and training rather than in monitoring. According to the WP29 opinion, before putting such measures in place the employer should consider how much data is really necessary, and what impact the processing has on the right to private life and the right to secure communications.

Nevertheless, contracts can serve as an information notice to the employees: according to Articles 13 and 14 of the GDPR, the employer has to provide the employee with certain information (the list is long, see e.g. guidance by the Information Commissioner’s Office). However, this can be done in other ways, e.g. information sheets, which bring lower transaction costs than amending contracts.

Furthermore, the report suggests that organisations conduct mock attacks on their employees in order to test their awareness. If executed correctly, this can indeed improve resilience; but if something goes wrong, the trust between employer and employee can be seriously damaged. The ethical implications of fake phishing attempts should have been at least mentioned, if not addressed, within the guidelines.

To conclude, the report brings an important contribution to improving company security. It clarifies certain issues regarding the NIS Directive and the GDPR and suggests how to adapt the two instruments into business practices. Certain questions regarding employees’ rights to privacy and data protection remain inadequately addressed, and will have to be resolved by using other sources, such as WP29’s opinions and the legal state of the art.

This blogpost is funded by the European Commission under Grant Agreements no. 740712 (COMPACT) and no. 653618 (DOGANA).


by Danaja Fabčič Povše (KU LEUVEN)


This project has received funding from the European Union’s Horizon 2020 Research and Innovation programme, under grant agreement No. 653618




The DOGANA phishing videogame

Want to try it?
Read more here and contact us


Phishing: awareness through play

Want to try it?
Read more here and contact us


Contraband pixels and texts
Read all about our literary-graphic competition on phishing and social engineering

All the pictures and novels