Due to lack of time, this blog is not receiving regular updates.
You can find my most recent research and event organisation info on one or more of my professional profiles at:

Posted in Uncategorized | Leave a comment

Call For Book Chapters


Editor: Francesco Flammini, Chairman, IEEE SMC Technical Committee on Homeland Security


Title Information

Book title: Resilience of Cyber-Physical Systems
Subtitle: From Risk Modelling to Threat Counteraction
Series title: Advanced Sciences and Technologies for Security Applications

Submission Information and Important Dates

  • Chapter submission deadline EXTENDED to: August 30th 2017
  • Notification of acceptance: October 30th 2017
  • Submission of final chapters: November 30th 2017
  • Estimated manuscript publication date: Spring 2018

Electronic submission website: https://easychair.org/conferences/?conf=rcps2018
Templates and guidelines: https://www.springer.com/gp/authors-editors/book-authors-editors/book-manuscript-guidelines


The resilience of Cyber-Physical Systems (CPS) is a highly cross-disciplinary topic with many challenges and open issues, including how to master the complexity and heterogeneity of critical infrastructures and of their growing cyber-physical threats.

The purpose of this book is to address the most recent developments in ensuring the convergence between Cyber and Physical Security by providing integrated, holistic and cohesive approaches to CPS design, evaluation and testing in real industrial applications, including Internet of Things, Intelligent Transportation Systems, Smart-Cities and Factory 4.0. Emerging paradigms and technologies are presented from two antithetic points of view: their exposure to new threats and their potential to counteract them, hence smoothly moving from risk modelling to threat management and mitigation.

Such an approach is aimed at effectively supporting readers, including graduate students, researchers and industry practitioners, in evaluating and ensuring the resilience of the CPS they are developing or analysing.

Topic Coverage

Chapters must be original work, neither published nor submitted for publication elsewhere. Revised and extended versions of published materials may be acceptable provided that they do not violate copyright, the necessary credits are given and the required permissions have been granted.

The list of key topics to be addressed in book chapters includes, but is not limited to:

  • Convergence between Cyber and Physical Security
  • Integrated, holistic and cohesive approaches to resilient CPS design, evaluation and test
  • Physical/Logical Security Information/Event Management systems (PSIM and SIEM)
  • Cyber-security of Industrial Control Systems (ICS)
  • CPS resilience models, metrics, middleware, and real-time indicators
  • Emerging Cloud Computing and Internet-of-Things (IoT) security issues
  • Advanced sensing and detecting technologies for CPS
  • Threat, Vulnerability and Risk Assessment for CPS
  • Interdependency analysis of critical infrastructures as cyber-physical systems-of-systems
  • Socio-economic, procedural, privacy-related and human factors in CPS
  • Attack/Penetration testing and other simulation techniques for CPS security evaluation
  • CPS intrusion detection and prevention systems
  • Business continuity planning, incident response and emergency/crisis management
  • Applications, case-studies and industrial experience reports in CPS domains including intelligent transport systems, wireless sensor networks, drones, smart-cities and smart-factories

Due to the complexity of CPS, a comprehensive list of topics is almost impossible to sketch, and those topics are rapidly evolving or being specialized as new technologies are introduced and new threats are discovered. The complexity due to systems’ size, distribution and heterogeneity is at the same time an obstacle and a stimulating challenge for research and engineering initiatives.

From a methodological viewpoint, many areas related to design-for-security and Model-Driven Engineering (MDE) still need to be extensively explored in their multi-faceted potential, currently representing a research niche. From the technology viewpoint, artificial intelligence paves the way to novel scenarios in which AI techniques are increasingly adopted and integrated.

A vision of the future of CPS in the context of Homeland Security cannot leave out of consideration the political strategy to find a balance between security technologies and other issues, like privacy, procedures and regulations.

The growing interest in and relevance of Homeland Security is witnessed by the efforts being carried out at all levels to push, sponsor and fund related investments, research and innovation, such as the EU’s Programme for European Critical Infrastructure Protection (EPCIP) and the EU Horizon 2020 initiative on Secure Societies.

In conclusion, CPS are playing a central role in Homeland Security; therefore, it is nowadays essential to evaluate how emerging paradigms and the most recent research developments, like big-data analytics, information fusion, early warning and automatic situation assessment, could help improve the resilience of CPS.

Tentative/Non-Binding Table of Contents



Part I: Risk Modelling in Cyber-Physical Systems

  1. Cyber-Physical Security Risk Modelling: Definitions, Methodologies, Metrics and Tools
  2. Quantitative and Stochastic Performability Evaluation in Embedded Systems
  3. Model-Based Threat/Vulnerability Assessment and Penetration Testing
  4. Holistic Multi-Level and Multi-Paradigm Modelling for Complex Systems-of-Systems
  5. Business Continuity and Contingency Planning for Critical Infrastructures

Part II: Threat Counteraction in Cyber-Physical Systems

  1. Cyber-Security Organization: IAM, SOC and CERT
  2. Systems for Early Warning and Situation Assessment: PSIM and SIEM
  3. Identification and Detection of Cyber-Physical Threats and Strategic Attacks
  4. Smart/Intelligent Firewalls, Intrusion Prevention and Detection Systems
  5. Procedures, Methods and Technologies for Emergency Preparedness, Incident Response and Crisis Management

Intended Audience

M.Sc. and Ph.D. students in Computer Science and Engineering who need an additional, up-to-date source providing the relevant state of the art for their project work and research activities on resilient CPS, computer dependability and critical infrastructure security.

Industry researchers and engineers from all domains, especially those addressing business-critical and safety-critical applications, who require methodologies, technologies and tools to help them develop and certify resilient CPS for infrastructure monitoring and control.

Posted in Computer e Internet, Ricerca, Work | Leave a comment

Appointed Chairman of the IEEE Systems, Man & Cybernetics (SMC) Society – Technical Committee on Homeland Security



Posted in Uncategorized | Leave a comment

nSHIELD meeting in Florence: group picture


Posted in Uncategorized | 1 Comment

From vinyl to tube amplifiers – The charm of vintage [guest post]

The past has always fascinated us. Modern things are certainly beautiful, and the latest fashion always sets the trend, but when we open the chest of memories, whether physical or imaginary, we cannot help being enchanted, indeed almost paralysed, by the chain of emotions we feel.

And so, when we pick up a vinyl record, it is like travelling back in time, to the years of the economic boom, to a time when things were simpler, more real and, why not, more carefree. Those were the years in which people discovered freedom: the freedom of a day trip, of a short skirt, of a kiss while listening to music from a jukebox.

Even the bulkiness had its own charm: a 45 rpm record certainly could not be tucked into a shirt pocket, but nothing compared to the experience of placing the turntable needle correctly at the beginning of the track. You felt like a conductor raising his arms to the sky to cue the whole orchestra.

Perhaps this is why, in this era dominated by hi-tech, we feel the call of vinyl so strongly: the rediscovery of the beauty of something manual, like taking out the record, placing it on the turntable and lowering the needle at the beginning of the track.

It is not simply about listening to music, but about being part of it, becoming one of its performers. Performers almost in the literal sense, so much so that anyone with a minimum of electronics knowledge enjoyed building at home a tube device capable of reproducing sound.

Those were years in which inventiveness ruled, helped along by a wallet that was more often than not nearly empty. Forget transistors, forget tape reels. Those were the years of needles and vacuum tubes, of turntables and tube equipment.

And it is interesting to see how today, decades later, after technology has taken giant steps, reaching further than even the science-fiction films of the time had imagined, it still cannot be said that those technologies are obsolete, old stuff. It is not just a matter of charm, but of technical solutions, of sound quality that remains at the very highest levels.

They may not be as versatile as certain modern devices, but for those who love excellence and, why not, indulging in a memory lived or simply retold, these instruments rise above the rank of vintage object: they become cult.


Contribution written by hifiprestige.it


Posted in Uncategorized | 1 Comment

Achieved DGSA (Dangerous Goods Safety Advisor) certification…

… according to ADR 2013:


Posted in Work | Leave a comment

Majority Voting: from Politics to Computers… and back?


by Francesco Flammini, IEEE Senior Member

Voting and computer dependability

I am quite sure most readers know about the unreliability of the first computers, when ‘bugs’ could be real insects and not programmers’ faults. Well, after many years, most computers continue to be quite unreliable, mainly due to increasing complexity which is often not well mastered by software engineers. One may tolerate hang-ups, blue screens, and even wrong results when running software on personal computers without getting too angry or frustrated; however, nobody would even think of accepting the risk of bugs causing accidents in brake-by-wire or any other critical control systems. That is why the latter are developed and tested in a way that is significantly different, more rigorous and more time-consuming, while the same effort would not be justified for non-critical systems.

But there are situations in which you may still have faults regardless of how much effort you put into software development: think for instance of cosmic radiation, which may cause bit flips in capacitor-based memories, or compiler faults, which are out of your control. In those and other situations, engineers rely upon redundancy, that is the use of multiple modules performing the same task, and diversity, that is the differentiation of programmers and development tools in order to prevent the same faults from showing up in different modules. Redundancy can be spatial, with modules operating in parallel, or temporal, with modules operating sequentially. In any case, the output of the modules is compared in order to check whether they agree on the same results. In other words, a concept is employed which is similar to the one used in politics when an important decision has to be made by checking the opinions of different people: just a few in case they are well experienced and educated on the matter, a lot more in case there are few guarantees about their knowledge and skills.
Well, democratic decision making may be imitated to fuse decisions coming from different (or differently installed) sensors, processors, or any other computing devices. A basic knowledge of probability theory ensures that if:

  • A and B are different individuals called to provide an answer to a factual (non-opinion) question
  • A and B do not significantly influence each other
  • A and B are not completely ignorant about the subject

the probability that they are both wrong is (very) low, that is, (much) lower than the probability of A or B being wrong individually. It is rather intuitive that the same concept can be extended to larger populations of individuals. After all, there are few doubts about which is the most valuable lifeline in “Who Wants to Be a Millionaire?”…
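As a quick numerical illustration (the error rates below are hypothetical, not taken from this post): if A and B answer independently and each is wrong with probability 0.2, the probability that both are wrong drops to 0.04.

```python
# Hypothetical individual error probabilities for A and B
p_a, p_b = 0.2, 0.2

# Under the independence assumption, both are wrong with probability p_a * p_b
p_both_wrong = p_a * p_b
print(round(p_both_wrong, 2))  # 0.04
```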

Formally speaking, in majority voting among M individuals, a decision is taken according to whether the condition expressed by the following formula is satisfied (YES) or not (NO):

Σ_{i=1..M} w_i · X_i ≥ K

where:

  • X_i is a boolean value representing the decision of the generic individual, which can be ‘1’ for ‘YES’ and ‘0’ for ‘NO’
  • K, M are positive integers with M > 2 and K = [M / 2] + 1 (the ‘[ ]’ operator indicates the integer part of the division)
  • w_i is the weight associated with the individual’s ‘reputation’, such that Σ_{i=1..M} w_i = M (all weights equal to 1 in the unweighted case)
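A minimal sketch of this weighted majority voting rule in Python (the function name and the equal-weight default are illustrative, not from the post):

```python
def majority_vote(decisions, weights=None):
    """Weighted majority voting over boolean decisions.

    decisions: list of 0/1 votes X_i from M individuals, with M > 2.
    weights:   reputation weights w_i; defaults to 1 per voter,
               which reduces to plain (unweighted) majority voting.
    Returns True (YES) iff sum(w_i * X_i) >= K, with K = [M / 2] + 1.
    """
    m = len(decisions)
    if m <= 2:
        raise ValueError("majority voting needs M > 2 voters")
    if weights is None:
        weights = [1] * m
    k = m // 2 + 1
    return sum(w * x for w, x in zip(weights, decisions)) >= k

# Two out of three voters say YES: the majority decision is YES
print(majority_vote([1, 1, 0]))  # True
```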

Now, majority voting is exactly the concept used in so-called N-modular redundant computer architectures, where different processors, electrically segregated and running diversely developed programs, run in parallel and their results are compared in order to reach an agreement on which output can be considered correct with a certain, quantifiable level of dependability.
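As a concrete check of why this pays off, consider triple-modular redundancy (TMR) with a perfect voter: if each of the three modules is correct independently with probability R, the system is correct when at least two modules agree on the correct output, giving the standard closed form 3R^2 - 2R^3 (a textbook result, not stated explicitly in this post). A brute-force enumeration confirms it:

```python
from itertools import product

def tmr_reliability(r):
    """Probability that at least 2 of 3 independent modules
    (each correct with probability r) produce the correct output."""
    total = 0.0
    for outcome in product([True, False], repeat=3):
        # Probability of this particular pattern of correct/faulty modules
        p = 1.0
        for ok in outcome:
            p *= r if ok else (1 - r)
        if sum(outcome) >= 2:  # majority of modules correct
            total += p
    return total

r = 0.9
print(round(tmr_reliability(r), 3))       # 0.972
print(round(3 * r**2 - 2 * r**3, 3))      # 0.972 (closed form agrees)
```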

Voting and people dependability

Are there any differences between reaching a consensus with majority voting in computer systems and with human beings? Well, the answer is yes: in the Web 2.0 era, the assumption that people do not influence each other is no longer realistic. In fact, discussions on Facebook and other social networks have been shown to significantly bias opinions. Furthermore, in politics the answer to important questions is often not merely correct or wrong, but related to taking the right (i.e. wisest) decision considering the context, the expected long-term consequences, as well as the well-being of the highest number of citizens. However, intuition suggests that web-driven majority voting could still provide some of the advantages mentioned above for computer systems.

First of all, let’s say that – on average – people trust computers more than they trust politicians. From an engineering point of view, perhaps the reason lies in the fact that – though coming from different parties – governments are often affected by so-called ‘common-mode failures’: they tend to be made up of people sharing the same will to get a ‘return on the investment’ and featuring limited technical skills. The cost for society of having thousands of them instead of hundreds (or tens, depending on the case) would be overly high. In fact, the costs associated with politicians tend to be quite high, and the general trend is toward reduction.

Now, a quite obvious question arises: since we do not trust politicians so much, shouldn’t we as citizens govern our countries and cities by ourselves? After all, over all those years we have raised our average level of education and developed all the enabling technologies. Unfortunately, so far it seems that e-voting is considered mostly a means to securely substitute the traditional ballot with an electronic one. Not many socio-technical studies address the issue of distributed agreement involving a large number of heterogeneous individuals as a standard mechanism to support governments in everyday decision making.

Nobody would even think of being governed by shy and solitary geniuses, due to their limited social and communicative skills; however, it is a pity that people like them will never play an active role in politics. Depending on their expertise, their opinion could be essential, much more so than those of less educated individuals. I would say their judgement should be weighted even more. Wouldn’t that be meritocracy at its essence?

I think we should go further in developing a better way of involving smart people in politics, allowing them to participate in the decision-making processes of local authorities and to join extended expert committees on the basis of their resumes. And all without the stress of elections, commuting or changing jobs. The enabling and secure ICT tools are already there, or may be developed quite easily. The still-open issue is how to combine and organise those tools so as to optimise the decision-making process in local and central governments, improving the quality of politics and reducing the costs for the citizens.

Call it e-democracy, e-government, e-participation or direct democracy: all the related paradigms have something to do with ensemble-based voting in decision making, which is the simplest way of achieving a reliable result out of possibly unreliable sources. Just like in safety-critical computers.



Posted in Computer e Internet, Notizie e politica, Ricerca | Leave a comment

Hi-Fi system: which one to choose

Take a look here.

Posted in Hi-Fi, Musica | Leave a comment

Effective Surveillance for Homeland Security: Balancing Technology and Social Issues (CRC Press / Taylor & Francis)

Finally available!


Posted in Uncategorized | Leave a comment

The New SHIELD Architectural Framework on ERCIM News


Posted in Ricerca | Leave a comment