The security of cyber-physical systems is today a field on which countries are basing their future economic growth. Despite its importance, it is also a field where the asymmetry between attackers and defenders keeps growing: dozens of new attacks with severe impact are discovered every day, while the technologies and methodologies for securing target systems struggle to advance at an adequate pace. Further research is strongly needed to improve the ability of security operators to face an ever-increasing mass of attacks more effectively and promptly. My research in this context focuses on the study of new approaches to support security analysts in their reverse engineering efforts. Some of the solutions I investigate are based on language-based models, which we exploit to automatically identify relevant characteristics in binary code.
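To make the "language-based" idea concrete, here is a minimal, hypothetical sketch: a disassembled binary is treated as a sentence of instruction mnemonics, and n-gram frequencies over that sequence serve as features a model could use to recognize relevant characteristics (e.g., compiler idioms or known library routines). The mnemonic sequence and the helper name are illustrative assumptions, not part of any specific tool.

```python
from collections import Counter

def mnemonic_ngrams(mnemonics, n=2):
    """Count the n-grams appearing in a sequence of instruction mnemonics.

    Treating mnemonics as words of a 'language' lets simple statistical
    models compare code fragments by their n-gram profiles.
    """
    # Build overlapping n-tuples: zip n shifted views of the sequence.
    grams = zip(*(mnemonics[i:] for i in range(n)))
    return Counter(grams)

# Illustrative mnemonic trace from a (hypothetical) disassembled function.
trace = ["push", "mov", "call", "mov", "call", "ret"]
features = mnemonic_ngrams(trace, n=2)
```

A real pipeline would feed such feature vectors (or learned embeddings) to a classifier; this sketch only shows the feature-extraction step.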
Stream processing
In the last few years we have witnessed a huge growth in information production. IBM claims that "every day, we create 2.5 quintillion bytes of data - so much that 90% of the data in the world today has been created in the last two years alone". This apparently unrelenting growth is a consequence of several factors, including the pervasiveness of social networks, the success of the smartphone market, the shift toward an "Internet of Things" and the consequent widespread deployment of sensor networks. Big Data applications are typically characterized by the three V's: large volumes (up to petabytes) arriving at high velocity (intense data streams that must be analyzed in quasi real time) with extreme variety (a mix of structured and unstructured data). These large datasets are typically analyzed either with a batch approach (using well-known frameworks like Apache Hadoop) or with stream processing. The latter approach, which represents data as a real-time flow of events, has proved particularly advantageous for all those applications where data is continuously produced and must be analyzed on the fly. Complex event processing engines apply complex detection and aggregation rules to intense data streams and output new events as a result. My research in this context focuses on studying novel solutions for increasing the scalability and efficiency of stream processing systems, as well as improving their resilience to faults.
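As a minimal sketch of the kind of rule a complex event processing engine applies, the example below aggregates a stream of readings over a sliding count-based window and emits a derived "alert" event when the window average crosses a threshold. The function name, window size, and threshold are illustrative assumptions, not taken from any particular CEP engine.

```python
from collections import deque

def windowed_alerts(stream, window_size=3, threshold=30.0):
    """Yield one (window_average, alert) pair per incoming event.

    A sliding count-based window of the last `window_size` readings is
    kept; an alert (a derived event) fires once the window is full and
    its average exceeds `threshold`.
    """
    window = deque(maxlen=window_size)  # deque drops the oldest reading
    for reading in stream:
        window.append(reading)
        avg = sum(window) / len(window)
        alert = len(window) == window_size and avg > threshold
        yield round(avg, 2), alert

# Illustrative input stream of temperature readings.
readings = [25.0, 28.0, 31.0, 33.0, 35.0]
events = list(windowed_alerts(readings))
```

Production engines express such rules declaratively and evaluate them incrementally over many concurrent streams; this sketch only illustrates the window-then-aggregate-then-derive pattern.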
© Università degli Studi di Roma "La Sapienza" - Piazzale Aldo Moro 5, 00185 Roma