With more data than ever being collected, transmitted and stored, security and privacy are areas of high research interest and value to diverse industries. The sensitive nature of storing and sharing private health information across organizations, for example, creates a growing need for strong security protections.
As technology continues to evolve rapidly and to collect and store more information, security risks for users grow. Here at Case School of Engineering, we prioritize safety and innovation in our work as we develop impactful outcomes.
The main areas of research focus within Security and Privacy include:
- Computational privacy: Developing techniques to quantify privacy risk and identify vulnerabilities.
- Protecting privacy: Developing privacy-enhancing technologies for user data shared with service providers, enabling service provision while protecting individual privacy. This includes predicting potential privacy violations and strengthening software solutions against them.
- Data liability: Addressing sequential data such as genomic and location information. This interdisciplinary research area engages partner researchers in the School of Medicine and School of Law, and collaborations with University Hospitals are exploring systems for secure data exchange across different hospital systems.
- Software security: Developing automated analysis techniques to detect software vulnerabilities and malicious behaviors, with applications such as mobile app security and smart contract security.
- System security: Developing system solutions that enable intrusion detection, cyber attack investigation, and secure interoperation, such as inter-blockchain communication.
Faculty who conduct research in Security and Privacy
Erman Ayday
Privacy-enhancing technologies, data security and applied cryptography
Michael Rabinovich
Investigates solutions to improve Internet performance, scalability and security
An Wang
Leverages advanced network technologies to enhance data center system performance and security; optimizes distributed machine learning system performance and investigates deep learning model security and privacy concerns.